String(x) puts "optional" in the output - swift

I'm writing a little program that reads and writes text files into an NSTableView. I had the reading working fine, and then moved onto the writing. And I got...
FR Optional(0) Optional(0) Optional(0) Optional(0) Optional(46.29) Optional(0)
I understand why this is happening: the values are NSNumbers in a dictionary, so they are, by definition, optional. But obviously this is not useful output.
Is there an easy way to output the value without the Optional and any similar bumf?

The reason you see Optional(n) is that you're printing the optional without unwrapping it.
I suggest you re-read the optionals chapter in the Apple book to get a better grasp of why this is happening to you.
The short version is that an Optional is a type; in fact, if you look at the Swift source code, you will find that it's just an enum!
public enum Optional<Wrapped> : _Reflectable, NilLiteralConvertible {
    case None
    case Some(Wrapped)
    /// Construct a `nil` instance.
    public init()
    /// Construct a non-`nil` instance that stores `some`.
    public init(_ some: Wrapped)
    /// If `self == nil`, returns `nil`. Otherwise, returns `f(self!)`.
    @warn_unused_result
    public func map<U>(@noescape f: (Wrapped) throws -> U) rethrows -> U?
    /// Returns `nil` if `self` is nil, `f(self!)` otherwise.
    @warn_unused_result
    public func flatMap<U>(@noescape f: (Wrapped) throws -> U?) rethrows -> U?
    /// Create an instance initialized with `nil`.
    public init(nilLiteral: ())
}
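Since it is just an enum, you can even pattern-match an Optional yourself. A small illustration (Swift 2-era syntax, to match the listing above):
let value: Int? = 42
switch value {
case .Some(let wrapped):
    print("wrapped value: \(wrapped)")   // prints "wrapped value: 42"
case .None:
    print("no value")
}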
So if your values are Optionals, you have to unwrap them to see their values.
Take a look at this code and try to guess the output:
var code: String? = "hello"
if let code = code where code == "hello" {
    print(code)
}
var upperCase = code?.uppercaseString
print(upperCase)
Output:
Did you figure it out?
It looks like this:
hello
Optional("HELLO")
Why does the first hello print ok?
Because it's being unwrapped in the if let statement.
But the second one, upperCase, is never unwrapped: code?.uppercaseString returns a new Optional, and an Optional stays an Optional until it is unwrapped, so printing it directly gives the output you see.
If you want to extract the value the optional is holding, you have two operators: ? and !.
The first one is usually preferred, but if you're sure the value is there, you can force the unwrap with !, so you could do
print(upperCase!)
Careful though: if upperCase happens to be nil, you'd get a runtime crash.
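If you just want clean output without the crash risk, unwrap with if let or fall back to a default with the nil-coalescing operator. A minimal sketch, continuing from the upperCase snippet above:
// Unwrap before printing, so "Optional(...)" never appears in the output.
if let upperCase = upperCase {
    print(upperCase)                // HELLO
}
// Or supply a default for the nil case.
print(upperCase ?? "no value")      // HELLO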

If the values you write are held in an array of optionals, then .flatMap, applied before printing, will do the trick for you:
// myOptionalDoubles : [Double?]
let myOptionalDoublesForWriting = myOptionalDoubles.flatMap { $0 }
Using .flatMap like this, you unwrap all non-nil values in the array myOptionalDoubles and get back an array myOptionalDoublesForWriting of non-optional type [Double]. Any nil entries in myOptionalDoubles will not carry over to myOptionalDoublesForWriting.
If you don't want to lose information about optional entries, you can use .map instead:
// myOptionalDoubles : [Double?]
var myOptionalDoublesForWriting = myOptionalDoubles.map { $0 ?? 0.0 }
In this case, you use the nil-coalescing operator to either unwrap each non-nil array entry or, if it is nil, give it the value 0.0. As above, the resulting array myOptionalDoublesForWriting will have the non-optional type [Double].
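For a concrete picture, here is a small sketch with made-up sample data; the array name matches the comments above. (In Swift 4.1 and later, this use of flatMap is spelled compactMap.)
let myOptionalDoubles: [Double?] = [0, nil, 46.29, nil, 3.5]
let dropped = myOptionalDoubles.flatMap { $0 }     // [0.0, 46.29, 3.5] (nils removed)
let padded = myOptionalDoubles.map { $0 ?? 0.0 }   // [0.0, 0.0, 46.29, 0.0, 3.5] (nils become 0.0)
print(dropped)
print(padded)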

Related

Forcing an Encoder's UnkeyedEncodingContainer to only contain one type of value

As part of a custom Encoder, I am coding an UnkeyedEncodingContainer. However, the specific format I am making it for requires that all elements of an array be of the same type. Specifically, arrays can contain:
Integers, all of the same size
Floats or Doubles
Other arrays (not necessarily all containing the same kinds of elements)
Objects
Here is the type of answer I need: the basis of an UnkeyedEncodingContainer implementation that conforms to the protocol and enforces that all elements be of one same type among those specified above.
As requested, here are examples of things that should or should not be encodable:
var valid1 = []
var valid2 = [3, 3, 5, 9]
var valid3 = ["string", "array"]
var invalid1 = [3, "test"]
var invalid2 = [5, []]
var invalid3 = [[3, 5], {"hello" : 3}]
// These may not all even be valid Swift arrays, they are only
// intended as examples
As an example, here is the best I have come up with, which does not work:
The UnkeyedEncodingContainer contains a function, checkCanEncode, and an instance variable, elementType:
var elementType : ElementType {
    if self.count == 0 {
        return .None
    } else {
        return self.storage[0].containedType
    }
}
func checkCanEncode(_ value : Any?, compatibleElementTypes : [ElementType]) throws {
    guard compatibleElementTypes.contains(self.elementType) || self.elementType == .None else {
        let context = EncodingError.Context(
            codingPath: self.nestedCodingPath,
            debugDescription: "Cannot encode value to an array of \(self.elementType)s"
        )
        throw EncodingError.invalidValue(value as Any, context)
    }
}
// I know the .None is weird and could be replaced by an optional,
// but it is useful as its rawValue is 0. The Encoder has to encode
// the rawValue of the ElementType at some point, so using an optional
// would actually be more complicated
Everything is then encoded as a contained singleValueContainer:
func encode<T>(_ value: T) throws where T : Encodable {
    let container = self.nestedSingleValueContainer()
    try container.encode(value)
    try checkCanEncode(value, compatibleElementTypes: [container.containedType])
}
// containedType is an instance variable of SingleValueContainer that is set
// when a value is encoded into it
But this causes an issue when it comes to nestedContainer and nestedUnkeyedContainer (used for stored dictionaries and arrays, respectively):
// This violates the protocol: this function should not be able to throw
func nestedContainer<NestedKey>(keyedBy keyType: NestedKey.Type) throws -> KeyedEncodingContainer<NestedKey> where NestedKey : CodingKey {
    let container = KeyedContainer<NestedKey>(
        codingPath: self.nestedCodingPath,
        userInfo: self.userInfo
    )
    try checkCanEncode(container, compatibleElementTypes: [.Dictionary])
    self.storage.append(container)
    return KeyedEncodingContainer(container)
}
As you can see, since I need checkCanEncode to know whether it is even possible to create a nested container in the first place (because if the array already contains things that aren't dictionaries, then adding dictionaries to it is invalid), I have to make the function throw. But this breaks the UnkeyedEncodingContainer protocol, which demands non-throwing versions.
But I can't just handle the error inside the function! If something tries to put an array inside an array of integers, it must fail, so this is not a valid solution.
Additional remarks:
Checking after the values have already been encoded feels sketchy, but checking only when producing the final encoded payload is definitely a violation of the "No Zombies" principle (fail as soon as the program enters an invalid state), which I would rather avoid. However, if no better solution is possible, I may accept it as a last resort.
One other solution I have thought about is encoding the array as a dictionary with numbered keys, since dictionaries in this format may contain mixed types. However, this is likely to pose decoding issues, so once again, it is a last resort.
Unless anyone has a better idea, here is the best I could come up with:
Do not enforce that all elements be of the same type inside the UnkeyedEncodingContainer
If all elements are the same type, encode it as an array
If elements have varying types, encode it as a dictionary with integers as keys
This is completely fine as far as the encoding format goes, has minimal cost, only slightly complicates decoding (check whether the keys are integers), and greatly widens how many different Swift objects will be compatible with the format.
Note: remember that the "real" encoding step, where the data is generated, is not actually part of the protocol. That is where I am proposing the shenanigans should take place 😈
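To make that concrete, here is a rough sketch of the payload-generation step with hypothetical helper types (ElementType, StoredElement and encodePayload are illustrative names, not part of the code above): emit an array when all stored element types match, and an integer-keyed dictionary otherwise.
enum ElementType { case none, integer, float, string, array, dictionary }
struct StoredElement {
    let containedType: ElementType
    let value: Any
}
func encodePayload(_ storage: [StoredElement]) -> Any {
    // Homogeneous storage: encode as a plain array of the stored values.
    if Set(storage.map { $0.containedType }).count <= 1 {
        return storage.map { $0.value }
    }
    // Mixed storage: fall back to a dictionary keyed by element index.
    var keyed = [Int: Any]()
    for (index, element) in storage.enumerated() {
        keyed[index] = element.value
    }
    return keyed
}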
I don't know whether this helps you, but in Swift you can overload functions: you can declare functions with the same name but different parameter types or constraints, and the compiler will always pick the right one. It's much more efficient than a type check at runtime.
The first method is called if the array conforms to Encodable
func encode<T: Encodable>(object: [T]) throws -> Data {
    return try JSONEncoder().encode(object)
}
Otherwise, the second method is called. If the array cannot even be encoded with JSONSerialization, you can add custom encoding logic:
func encode<T>(object: [T]) throws -> Data {
    do {
        return try JSONSerialization.data(withJSONObject: object)
    } catch {
        // do custom encoding and return Data
        return try myCustomEncoding(object)
    }
}
This example
let array = [["1":1, "2":2]]
try encode(object: array)
calls the first method.
On the other hand, this calls the second method, even though the actual content is not heterogeneous:
let array : [[String:Any]] = [["1":1, "2":2]]
try encode(object: array)

Why does the nil-coalescing operator return an optional?

I use Swift's nil-coalescing operator in a chain to print one of three values. The first two are optional, the last one is non-optional:
let optionalString: String? = nil
let optionalError: Error? = MyError.anError
let defaultString: String = "default"
print(optionalString ?? (optionalError ?? defaultString))
I was surprised to see that this code prints
Optional(MyError.anError)
to the console. I was expecting a non-optional:
MyError.anError
Why is this an optional?
The error is defined as follows:
enum MyError: Error {
    case anError
}
I know that there are two implementations of the nil-coalescing operator:
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T) rethrows -> T
and
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T?) rethrows -> T?
According to this, ?? should return an optional (T?) only if the defaultValue is an optional as well (T?). But this is not the case:
The default value for the first ?? operator is (optionalError ?? defaultString). To evaluate this value's type, we need to evaluate the result of the second ?? operator. As defaultString is of type String and thus non-optional, function (1) is used for that nil-coalescing operator, so the result of (optionalError ?? defaultString) must be non-optional as well. That result is the default value for the first ?? operator, so again function (1) should be used, and optionalString ?? (optionalError ?? defaultString) should return a non-optional.
Apparently, there is a flaw in my logic, but I don't see it. Any explanation for this?
Disclaimer: I wasn't sure if I should add this as a supplement to my question or post it as an answer. I decided on the latter. It's not a bullet-proof explanation, but I think it's "close enough" for many who might run into the same problem.
What's going wrong: The Big Picture
The two comments stating that the provided code doesn't compile for the commenters pushed me in the right direction:
I copied the code into a blank playground and ran it. For me, it did compile, but it also showed a warning that hadn't appeared before (because, for some reason, autocompletion and compiler warnings were broken in the original project).
The warning
Expression implicitly coerced from 'Any?' to 'Any'
tells me that the expression (optionalError ?? defaultString) is apparently evaluated as type Any?, which is an optional. The compiler then chooses ?? operator function (1) to evaluate the entire expression, which requires that its default value (optionalError ?? defaultString) be non-optional. Thus, this value is implicitly coerced from Any? to Any, and that's why Optional(MyError.anError) is printed.
Why is this going wrong?
There are two questions remaining:
Why does the compiler choose operator function (1)?
Why is (optionalError ?? defaultString) evaluated to Any??
I don't have an answer for question 1, but my guess would be that the compiler simply assigns a higher precedence to function (1) when the parameter types don't match and it's unclear which function to use. (I'll be happy to be corrected.)
Regarding question 2:
optionalError is of type MyError? and defaultString is of type String. The signature of the ?? operator functions read:
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T) rethrows -> T
func ?? <T>(optional: T?, defaultValue: @autoclosure () throws -> T?) rethrows -> T?
So both parameters, optional and defaultValue must have the "same" generic type (only that in the first case, one parameter is an optional of that particular type and the other one is not). MyError and String are obviously not the same type. So in order for this to match the criteria of the function signature, the compiler must coerce both types to Any.
So MyError? is coerced to Any?, and apparently String is also coerced to Any?; thus, function (2) is used, which evaluates (optionalError ?? defaultString) to Any?.
That again raises the question of why function (2) is used in this case, which somewhat contradicts my assumption for question (1), but my main take-away here is that it's simply a bad idea to use two different types with the nil-coalescing operator.
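One way to sidestep the problem, sticking with the values defined in the question, is to give every operand of ?? the same type before chaining, so that only function (1) is ever involved. A minimal sketch that converts the error to a String first:
let message = optionalString ?? optionalError.map { String(describing: $0) } ?? defaultString
print(message)   // prints "anError" with no Optional(...) wrapper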
Background:
I got this code example from a textbook on RxSwift; it is slightly different in that it uses a generic type that is required to conform to CustomStringConvertible:
func print<T: CustomStringConvertible>(label: String, event: Event<T>) {
    print(label, event.element ?? event.error ?? event)
}
What got me experimenting with this in the first place was that when I ran the code from the book it printed an optional whereas the book claimed that it would print a non-optional.

Optional chaining with Swift strings

With optional chaining, if I have a Swift variable
var s: String?
s might contain nil, or a String wrapped in an Optional. So, I tried this to get its length:
let count = s?.characters?.count ?? 0
However, the compiler wants this:
let count = s?.characters.count ?? 0
My understanding of optional chaining is that, once you start using ?. in a dotted expression, the rest of the properties are made optional and are typically accessed by ?., not ..
So, I dug a little further and tried this in the playground:
var s: String? = "Foo"
print(s?.characters)
// Output: Optional(Swift.String.CharacterView(_core: Swift._StringCore(_baseAddress: 0x00000001145e893f, _countAndFlags: 3, _owner: nil)))
The result indicates that s?.characters is indeed an Optional instance, suggesting that s?.characters.count should be illegal.
Can someone help me understand this state of affairs?
When you say:
My understanding of optional chaining is that, once you start using ?. in a dotted expression, the rest of the properties are made optional and are typically accessed by ?., not ..
I would say that you are almost there.
It’s not that all the properties are made optional, it’s that the original call is optional, so it looks like the other properties are optional.
characters is not an optional property, and neither is count, but the value that you are calling it on is optional. If there is a value, then the characters and count properties will return a value; otherwise, nil is returned. It is because of this that the result of s?.characters.count returns an Int?.
If either of the properties were optional, then you would need to add ? to it, but, in your case, they aren’t. So you don’t.
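To see the types involved, a quick playground check (Swift 2-era syntax, matching the question):
let s: String? = "Foo"
let count = s?.characters.count   // count is of type Int?, not Int
print(count)                      // Optional(3)
print(count ?? 0)                 // 3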
Edited following comment
From the comment:
I still find it strange that both s?.characters.count and (s?.characters)?.count compile, but (s?.characters).count doesn't. Why is there a difference between the first and the last expression?
I’ll try and answer it here, where there is more room than in the comment field:
s?.characters.count
If s is nil, the whole expression returns nil, otherwise an Int. So the return type is Int?.
(s?.characters).count // Won’t compile
Breaking this down: if s is nil, then (s?.characters) is nil, so we can’t call count on it.
In order to call the count property on (s?.characters), the expression needs to be optionally unwrapped, i.e. written as:
(s?.characters)?.count
Edited to add further
The best I can get to explaining this is with this bit of playground code:
let s: String? = "hello"
s?.characters.count
(s?.characters)?.count
(s)?.characters.count
((s)?.characters)?.count
// s?.characters.count
func method1(s: String?) -> Int? {
    guard let s = s else { return nil }
    return s.characters.count
}
// (s?.characters).count
func method2(s: String?) -> Int? {
    guard let c = s?.characters else { return nil }
    return c.count
}
method1(s)
method2(s)
On the Swift-users mailing list, Ingo Maier was kind enough to point me to the section on optional chaining expressions in the Swift language spec, which states:
If a postfix expression that contains an optional-chaining expression is nested inside other postfix expressions, only the outermost expression returns an optional type.
It continues with the example:
var c: SomeClass?
var result: Bool? = c?.property.performAction()
This explains why the compiler wants s?.characters.count in my example above, and I consider that it answers the original question. However, as @Martin R observed in a comment, there is still a mystery as to why these two expressions are treated differently by the compiler:
s?.characters.count
(s?.characters).count
If I am reading the spec properly, the subexpression
(s?.characters)
is "nested inside" the overall postfix expression
(s?.characters).count
and thus should be treated the same as the non-parenthesized version. But that's a separate issue.
Thanks to all for the contributions!

Initialize local variable inside closure

Is it possible to initialize a variable in a closure? Specifically, the following code gives an error:
func doSomething(todo: (Void -> Void)) -> Void {
    todo()
}
var i: Int
doSomething({ i = 3 })
print(i)
because i is captured before being initialized. Of course, I can always just give the variable a default value, and skipping that is usually a micro-optimization, but I'm wondering.
Edit:
In the end I've gone with an implicitly unwrapped optional var i: Int!,
thanks to @Laffen and @dfri for pointing me in the right direction. Using an optional should be the best way in the majority of cases.
If no default value is set for the initialization, you should append the ? to make this an optional.
var i: Int?
func someFunc() {
    i = 1
}
Note: To enhance the readability of this answer together with the comment made by @dfri, I've included the comment in the answer:
Perhaps worth mentioning that i is now an implicitly unwrapped optional, so it's possible for i to take a value nil, whereafter trying to access nil-valued i will not prompt compile time warning, however yielding a runtime exception. E.g., i=nil ... print(i). For this case, I find it safer to let i be a "regular" optional, in which case compiler will prompt you for unwrapping (and you can do this in a safe manner rather than implicitly using forced unwrapping ! by default: e.g. var i: Int? .... print(i ?? "nil")), or, safely print "nil" if not unwrapped.
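Putting the pieces together, a minimal sketch of the optional-based approach (Swift 2-era syntax to match the question, unwrapping safely as suggested in the comment above):
func doSomething(todo: (Void -> Void)) -> Void {
    todo()
}
var i: Int?
doSomething({ i = 3 })
if let value = i {
    print(value)              // prints "3"
} else {
    print("i was never set")
}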
If you move the print(i) inside the closure then it works.
func doSomething(todo: (Void -> Void)) -> Void {
    todo()
    print(i)
}
var i: Int
doSomething({ i = 3 })
If the value is not mutated outside the closure, Swift will make a copy of that value, so the i referred to outside doSomething is different from the one inside doSomething.

How to handle initial nil value for reduce functions

I would like to learn and use more functional programming in Swift. So, I've been trying various things in playground. I don't understand Reduce, though. The basic textbook examples work, but I can't get my head around this problem.
I have an array of strings called "toDoItems". I would like to get the longest string in this array. What is the best practice for handling the initial nil value in such cases? I think this probably happens often. I thought of writing a custom function and using it.
func optionalMax(maxSofar: Int?, newElement: Int) -> Int {
    if let definiteMaxSofar = maxSofar {
        return max(definiteMaxSofar, newElement)
    }
    return newElement
}
// Just testing - nums is an array of Ints. Works.
var maxValueOfInts = nums.reduce(0) { optionalMax($0, $1) }
// ERROR: cannot invoke 'reduce' with an argument list of type '(nil, (_,_)->_)'
var longestOfStrings = toDoItems.reduce(nil) { optionalMax(count($0), count($1)) }
It might just be that Swift does not automatically infer the type of your initial value. Try making it clear by explicitly declaring it:
var longestOfStrings = toDoItems.reduce(nil as Int?) { optionalMax($0, count($1)) }
By the way, notice that I do not call count on $0 (your accumulator), since it is not a String but an optional Int (Int?).
Generally, to avoid confusion when reading the code later, I explicitly label the accumulator as a and the element coming in from the sequence as x:
var longestOfStrings = toDoItems.reduce(nil as Int?) { a, x in optionalMax(a, count(x)) }
This should be clearer than $0 and $1 when the accumulator or the single element is used in the code.
Hope this helps
Initialise it with an empty string "" rather than nil. Or you could even initialise it with the first element of the array, but an empty string seems better.
Second go at this after writing some wrong code: this will return the longest string, if you are happy with an empty string being returned for an empty array:
toDoItems.reduce("") { count($0) > count($1) ? $0 : $1 }
Or if you want nil, use
toDoItems.reduce(nil as String?) { count($0!) > count($1) ? $0 : $1 }
The problem is that the compiler cannot infer the types you are using for your seed and accumulator closure if you seed with nil, and you also need to get the optional type correct when using the optional string as $0.
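For reference, here is a small self-contained sketch of the nil-seeded approach with made-up sample data, written with the later characters.count spelling instead of the free count(_:) function used in the question:
let toDoItems = ["buy milk", "walk the dog", "fix the build"]
let longestOfStrings = toDoItems.reduce(nil as String?) { a, x in
    guard let a = a else { return x }   // the first element becomes the seed
    return x.characters.count > a.characters.count ? x : a
}
print(longestOfStrings ?? "array was empty")   // prints "fix the build"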