How to downcast without using a new variable in Swift?

I only know of this way to downcast:
let x = y as? Subclass
x?.subclassMethod()
But this introduces a new variable x. Is there a way to just operate on y?
Most languages would allow you to do this:
((Subclass)y).subclassMethod()

The Swift way is to use parentheses:
(y as? Subclass)?.subclassMethod()
The second ? is needed to optional-chain the expression, since as? produces an optional.
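As a minimal, self-contained sketch (Animal, Dog, and bark() are made-up names for illustration):
class Animal {}
class Dog: Animal {
    func bark() { print("Woof") }
}

let y: Animal = Dog()
// Cast and call in one expression: bark() runs if the cast succeeds,
// and the whole expression is a no-op if it fails.
(y as? Dog)?.bark()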

Instantiating a class with "!" in Swift? [duplicate]

I understand that in Swift all variables must be set with a value, and that by using optionals we can set a variable to be set to nil initially.
What I don't understand is what declaring a variable with a ! does, because I was under the impression that ! "unwraps" a value from an optional. I thought that by doing so, you are guaranteeing that there is a value to unwrap in that variable, which is why you see it used on IBActions and such.
So simply put, what is the variable being initialized to when you do something like this:
var aShape : CAShapeLayer!
And why/when would I do this?
In a type declaration, the ! is similar to the ?. Both declare an optional, but the ! declares an "implicitly unwrapped" optional, meaning that you do not have to unwrap it to access the value (but it can still be nil).
This is basically the behavior we already had in Objective-C: a value can be nil, and you have to check for it, but you can also just access the value directly as if it weren't an optional (with the important difference that if you don't check for nil, you'll get a runtime error).
// Cannot be nil
var x: Int = 1
// The type here is not "Int", it's "Optional Int"
var y: Int? = 2
// The type here is "Implicitly Unwrapped Optional Int"
var z: Int! = 3
Usage:
// you can add x and z
x + z == 4
// ...but not x and y, because y needs to be unwrapped
x + y // error
// to add x and y you need to do:
x + y!
// but you *should* do this:
if let y_val = y {
    x + y_val
}
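As for when you would declare a property this way: the typical case is two-phase initialization, where the value can't be created in init but is guaranteed to be set before it is first read. A minimal sketch using the CAShapeLayer from the question (ShapeHolder and its methods are made up for illustration):
import QuartzCore

class ShapeHolder {
    // nil after init, but guaranteed to be set in setup()
    // before anyone reads it, so we avoid unwrapping everywhere
    var aShape: CAShapeLayer!

    func setup() {
        aShape = CAShapeLayer()
    }

    func useShape() {
        // accessed as if non-optional; crashes at runtime
        // if setup() was never called
        aShape.fillColor = nil
    }
}
This is also why @IBOutlet properties are conventionally declared with !: they are nil when the object is initialized, but Interface Builder connects them before you would normally touch them.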

Pointer dereferencing Swift

In C and Objective-C, we used to dereference a pointer and get the value as follows:
p->a = 1
or int x = p->a
But I can't find an equivalent in Swift. I have a return value of type UnsafePointer<AudioStreamBasicDescription>? whose member values I need to read.
You use the pointee property on your UnsafePointer to access the memory it points to. So your C example would read as let x = p.pointee.a.
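A minimal sketch with the AudioStreamBasicDescription case from the question (the function itself is hypothetical):
import AudioToolbox

func sampleRate(of p: UnsafePointer<AudioStreamBasicDescription>?) -> Float64? {
    guard let p = p else { return nil }
    // p.pointee is the Swift spelling of *p, so p.pointee.mSampleRate
    // corresponds to p->mSampleRate in C
    return p.pointee.mSampleRate
}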

Binary operator '*' cannot be applied to operands of type 'Float' and 'Float!'

When I do the following:
let gapDuration = Float(self.MONTHS) * Float(self.duration) * self.gapMonthly;
I get the error:
Binary operator '*' cannot be applied to operands of type 'Float' and 'Float!'
But when I do:
let gapDuration = 12 * Float(self.duration) * self.gapMonthly;
Everything is working fine.
I have no idea what this error is telling me.
self.gapMonthly is of type Float! and self.duration and self.MONTHS are of type Int!
I would consider this a bug (at the very least, the error message is misleading). It appears to arise when a binary operator is applied across three or more expressions of a given type, where one or more of those expressions is an implicitly unwrapped optional of that type.
This simply stretches the type-checker too far, as it has to consider every possibility of treating the IUO as a strong optional (due to SE-0054, the compiler will treat an IUO as a strong optional if the expression can be type-checked as one), while also attempting to find the correct overloads for the operators.
At first glance this looks similar to the issue in How can I concatenate multiple optional strings in swift 3.0?, but that bug was fixed in Swift 3.1, while this one is still present.
A minimal example that reproduces the same issue would be:
let a: Float! = 0
// error: Binary operator '*' cannot be applied to operands of type 'Float' and 'Float!'
let b = a * a * a
and it is present for binary operators other than *:
// error: Binary operator '+' cannot be applied to operands of type 'Float' and 'Float!'
let b = a + a + a
It is also reproducible when explicitly annotating b as a Float:
let b: Float = a * a * a // doesn't compile
and when mixing in plain Float expressions, as long as at least one Float! expression remains:
let a: Float! = 0
let b: Int = 0
let c: Int = 0
let d: Float = a * Float(b) * Float(c) // doesn't compile
A simple fix for this would be to explicitly force unwrap the implicitly unwrapped optional(s) in the expression:
let d = a! * Float(b) * Float(c) // compiles
This relieves the pressure on the type-checker, as now all the expressions evaluate to Float, so overload resolution is much simpler.
Of course, it goes without saying that this will crash if a is nil. In general, you should try to avoid using implicitly unwrapped optionals and instead prefer strong optionals; and, as @vadian says, always use non-optionals in cases where the value being nil doesn't make sense.
If you need to use an optional and aren't 100% sure that it contains a value, you should safely unwrap it before doing the arithmetic. One way of doing this would be to use Optional's map(_:) method in order to propagate the optionality:
let a: Float! = 0
let b: Int = 0
let c: Int = 0
// the (a as Float?) cast is necessary if 'a' is an IUO,
// but not necessary for a strong optional.
let d = (a as Float?).map { $0 * Float(b) * Float(c) }
If a is non-nil, d will be initialized to the result of the unwrapped value of a multiplied by Float(b) and Float(c). If, however, a is nil, d will be initialized to nil.
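For example, a quick sketch of both outcomes:
let present: Float? = 2
let absent: Float? = nil

let r1 = present.map { $0 * 10 } // Optional(20.0)
let r2 = absent.map { $0 * 10 }  // nil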

Swift Hoisting? [closed]

As everyone knows, JavaScript "hoists" variables to the top of the file or scope. But from my understanding, let is the same as var, only let is confined to the scope it is defined in.
Since Swift is a compiled language instead of an interpreted one, can we assume it does NOT do this?
For example:
x = 10
var y = x + 10
var x
You can assume whatever you want, but no matter the programming language, your absolute best bet is to just try it and find out. If you paste your sample code into a Playground or any Swift-capable IDE, or just run it through the command line, you'll quickly find that this simply does not work.
Your question is somewhat confusing, but I think I can address all or at least most of your questions.
x = 10
var y = x + 10
var x
Assuming there is no other code to go with your original sample, it simply does not compile. All three lines have a problem.
The first two lines complain about the use of an unresolved identifier 'x'. In English, this means Swift can't figure out what variable you're talking about. Variables in Swift must be declared before they are used, so the declaration on line three doesn't help the two lines above it.
The third line complains that it can't figure out what type x should be: "Type annotation missing in pattern". In some cases, Swift can figure out what type our variable should be. For example, with var x = 10, Swift can figure out that x's type should be Int. If we want something else, we must specify. But if we aren't assigning a value in the declaration, Swift has no idea and must be told: var x: Int.
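Putting both fixes together, a corrected version of the snippet would be:
var x: Int // declared before use, with an explicit type
x = 10
var y = x + 10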
What about the case where x exists at a different scope?
Well, Swift allows variable shadowing. That is to say, a variable declared at one scope hides a variable declared at another scope.
So, for example:
class Foo {
    let x = 10
    func foo(value: Int) -> Int {
        let a = value * self.x
        let x = 10
        return a * x
    }
}
Now we can use some x before we've declared a locally scoped x, but these are different variables. Also, perhaps most importantly, notice the self. prepended to x here. It's required. Without it, Swift will refuse to compile this code and will complain: "Use of local variable 'x' before its declaration."
However, within functions of classes is not the only place we can shadow variables. We can also do it within if blocks (among other places) which is where things can get a little more confusing. Consider this:
class Foo {
    let x = 10
    func foo(value: Int) -> Int {
        print(x)
        if x > 3 {
            let x = 2
            print(x)
        }
        return x
    }
}
Here, we've used x twice before declaring it. And we didn't have to make use of the self. and it doesn't complain and compiles perfectly fine. But it's important to note that outside the if block (including the x > 3 conditional) the x we're referencing is the instance variable, but inside the if block, we've created a new variable called x which shadows the instance variable. We can also create the same sort of shadowing by using if let and if var constructs (but not guard let).
The result of calling this function will be that the value 10 is printed, we enter the if block, the value of 2 is printed, then we exit the if block and the value of 10 is returned.
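A quick sketch of the if let style of shadowing mentioned above:
let x: Int? = 5
if let x = x {
    // this x is a new, non-optional Int that shadows the optional x above
    print(x + 1) // prints 6
}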
Now let's throw var into the mix. First, if you aren't already aware, you should start by reading this, which explains the difference between let and var (one is a constant, the other is not).
Let's combine let vs var with scoping and variable shadowing to see how it affects things.
class Foo {
    let x = 10
    func foo(value: Int) -> Int {
        print(x)
        if x > 3 {
            var x = 2
            while x < 10 {
                print(x)
                x += 3
            }
            return x
        }
        return x
    }
}
Okay, so this is the same as before but with a slightly more complicated bit within the if block.
Here, our locally-scoped variable is declared as a var while our instance variable remains a constant let. We cannot modify the instance variable, but we can modify the local variable. And we do so, on each iteration of the while loop.
But importantly, this variable is an entirely different variable from the instance variable. It might as well have a completely different name (and in practice, it basically always should have a different name). So modifying our local variable x doesn't change anything about our more broadly scoped instance variable x. They're different variables that reside in different memory locations. And once a variable is declared as a let or a var, that variable cannot be changed to the other.
'let' and 'var' do not have a difference in scope.
Global variables are variables that are defined outside of any function, method, closure, or type context. Local variables are variables that are defined within a function, method, or closure context.
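A quick sketch of that distinction, using both let and var (the names are made up):
let globalConstant = 1 // global: defined outside any function or type

func demo() {
    var localVariable = 2 // local: defined within a function
    localVariable += globalConstant
}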

Assigning last array element to a variable in Swift

I have this very simple line of code
var dblArray : [Double] = [0.01]
var x = dblArray.last
println(x * x)
The .last property returns the last element of the array, which is 0.01. However, the playground assistant view shows that the actual value assigned to var x is Some(0.01), and doing a println prints "Optional(0.01)".
What I'm hoping to accomplish is merely capturing the value of the last element and placing it in x.
What am I doing wrong here?
I'm pretty certain .last would have to be an optional, if only to handle the edge case of an empty array, where .last would make no sense as a "solid" value.
In any case, if you're sure the array won't be empty, just unwrap the value. If you're not sure then you'll need to check intelligently such as with:
var x = 0.0 // Double, so it can hold the array's element type
if let junk = dblArray.last {
    x = junk
}
I think that's the correct syntax, I don't have my Mac with me at the moment, but it should hopefully be close enough to show the concept.
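Applied to the original snippet, either of these works (the ?? version falls back to 0.0 for an empty array, which may or may not be what you want):
var dblArray: [Double] = [0.01]

if let x = dblArray.last {
    println(x * x) // 0.0001
}

// or, with a default for the empty-array case:
let y = dblArray.last ?? 0.0
println(y * y)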