I'm trying to declare an array of a static size. I'd like a constant to define the size of the array.
I'm trying the following in Swift
class foo {
let size = 10
let myArray = [Int](count: size, repeatedValue: 0)
}
But this fails with an error,
'foo.Type' does not have a member named 'size'
If I don't use the size constant, the compiler is happy, but that isn't what I'd like. And there's no #define capability that I'm aware of.
let myArray = [Int](count: 10, repeatedValue: 0)
Swift gives you a couple ways to do this. The simplest, and most in line with the #define style you mention, is to declare size as a global constant:
let FOOSIZE = 10
class Foo {
let myArray = [Int](count: FOOSIZE, repeatedValue: 0)
}
Alternatively, you can define myArray as a lazy variable, and use a closure to populate its value. By the time the closure is executed you'll be able to access self.size:
class Foo {
let size = 10
lazy var myArray: [Int] = { [Int](count: self.size, repeatedValue: 0) }()
}
In Swift, self is unavailable until all class/struct properties have been initialized and, in the case of a subclass, the superclass initializer has been called.
In your case you are initializing properties outside of an initializer, but that doesn't change the result: you cannot initialize a property with an expression that implicitly references self (which is what you do when accessing the size property).
However, size looks like a constant, and as such it's better to create it once (as a static property) rather than storing a copy in each class instance. Swift classes don't support stored static properties, but structs do, so a trick is to define an inner private struct containing the static immutable properties you may need:
class foo {
private struct Static {
static let size = 10
}
let myArray = [Int](count: Static.size, repeatedValue: 0)
}
With Swift 1.2 you can simply add static before let size, making it a type constant that is defined before myArray is initialized:
class foo {
static let size = 10
let myArray = [Int](count: size, repeatedValue: 0)
}
Be aware though, that using size later in your code requires you to fully qualify it as foo.size.
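For readers on current Swift: the count:repeatedValue: initializer has since been renamed to repeating:count:, but the static let pattern is unchanged. A minimal sketch, including the qualified access from outside the type:

```swift
class Foo {
    static let size = 10
    // Inside the type, the static constant can be referenced
    // unqualified in a property initializer.
    let myArray = [Int](repeating: 0, count: size)
}

// Outside the type, the constant must be qualified with the type name.
print(Foo.size)            // 10
print(Foo().myArray.count) // 10
```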
One way is to do the initialization inside a function. This worked in a playground:
import UIKit
class ViewController: UIViewController {
var myArray = [Int]()
func appendArray (#index: Int, value: Int) {
myArray[index] = value
}
override func viewDidLoad() {
super.viewDidLoad()
let size = 10
myArray = [Int](count: size, repeatedValue: 0)
appendArray(index: 3, value: 4)
println(myArray)
}
}
Okay, I used a ViewController because it was convenient, but it isn't necessary. There's no problem declaring the array outside of a function; I still used a function to create the array and another one to change a value.
TL;DR:
I want a protocol to provide default init behavior, but the compiler resists adopters adding more stored properties. I solved this with composition instead of inheritance, but what's wrong with my original approach?
Motivation
I want to automate the transformation of objects from design specifications to runtime specs. I use the example of scaling a CGSize, but the intent is more general than just geometric layout. (In other words, my solution won't be to adopt, reject, or rewrite Auto Layout.)
Code
You can paste this right into a Playground, and it will run correctly.
protocol Transformable {
var size : CGSize { get } // Will be set automatically;
static var DESIGN_SPEC : CGSize { get } // could be any type.
init(size: CGSize) // Extension will require this.
}
// A simple example of transforming.
func transform(_ s: CGSize) -> CGSize {
CGSize(width: s.width/2, height: s.height/2)
}
// Add some default behavior.
// Am I sinning to want to inherit implementation?
extension Transformable {
init() { self.init(size: transform(Self.DESIGN_SPEC)) }
// User gets instance with design already transformed. No muss, fuss.
}
// Adopt the protocol...
struct T : Transformable {
let size: CGSize
static let DESIGN_SPEC = CGSize(width: 10, height: 10)
}
// ...and use it.
let t = T()
t.size // We get (5,5) as expected.
But every Eden must have its snake. I want a Transformable with another property:
struct T2 : Transformable {
// As before.
let size: CGSize
static let DESIGN_SPEC = CGSize(width: 10, height: 10)
let i : Int // This causes all sorts of trouble.
}
Whaa? Type 'T2' does not conform to protocol 'Transformable'
We have lost the synthesized initializer that sets the size member.
So... we put it back:
struct T3 : Transformable {
// As before.
let size: CGSize
static let DESIGN_SPEC = CGSize(width: 10, height: 10)
let i : Int
init(size: CGSize) {
self.size = size
self.i = 0 // But this is a hard-coded value.
}
}
But now our new member is statically determined. So we try adding another initializer:
struct T4 : Transformable {
// As before.
let size: CGSize
static let DESIGN_SPEC = CGSize(width: 10, height: 10)
let i : Int
init(size: CGSize) { self.size = size ; self.i = 0 }
// Try setting 'i':
init(i: Int) {
self.init() // Get the design spec properly transformed.
self.i = i // 'let' property 'i' may not be initialized directly;
} // use "self.init(...)" or "self = ..." instead
}
Declaring i as a var shuts the compiler up, but I want i to be immutable. Explain to me why what I want is so wrong... This page is too small to include all the variations I tried, but perhaps I have missed the simple answer.
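For completeness, the composition workaround mentioned in the TL;DR might look roughly like this sketch (the names Core and T5 are mine, not from the original code): the conformance lives in a minimal nested type, and the extra immutable state lives in a wrapper, so the synthesized init(size:) is never lost.

```swift
import Foundation // for CGSize

protocol Transformable {
    var size: CGSize { get }
    static var DESIGN_SPEC: CGSize { get }
    init(size: CGSize)
}

func transform(_ s: CGSize) -> CGSize {
    CGSize(width: s.width / 2, height: s.height / 2)
}

extension Transformable {
    init() { self.init(size: transform(Self.DESIGN_SPEC)) }
}

// The conforming type stays minimal, so its memberwise
// init(size:) keeps satisfying the protocol requirement...
struct Core: Transformable {
    let size: CGSize
    static let DESIGN_SPEC = CGSize(width: 10, height: 10)
}

// ...and the extra immutable state lives in the wrapper.
struct T5 {
    let core = Core() // design spec already transformed
    let i: Int        // settable per instance, still a 'let'
    var size: CGSize { core.size }
}

let t5 = T5(i: 42)
// t5.size is (5.0, 5.0); t5.i is 42
```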
Suppose that I have an array of UIView?s that have a value by default:
var firstView: UIView? = UIView()
var secondView: UIView? = UIView()
let views = [firstView, secondView]
I want to change every value that's in the array to nil. The ideal solution would be iterating through the elements of the array and setting them to nil. However, this does not work:
print(firstView) //Optional(<UIView: [address]; frame = (0 0; 0 0); layer = <CALayer: [address]>>)
for i in 0 ..< views.count {
views[i] = nil
}
//firstView is still not nil
print(firstView) //Optional(<UIView: [address]; frame = (0 0; 0 0); layer = <CALayer: [address]>>)
What could be the solution for this?
You could use KeyPaths:
class Foo {
let views = [\Foo.view1, \Foo.view2]
var view1: UIView? = UIView()
var view2: UIView? = UIView()
func freeAll() {
print(view1 as Any, view2 as Any)
for keyPath in views {
self[keyPath: keyPath] = nil
}
print(view1 as Any, view2 as Any)
}
}
let foo = Foo()
foo.freeAll()
Output:
Optional(<UIView: 0x7fc5c4107bb0; frame = (0 0; 0 0); layer = <CALayer: 0x60000213e760>>) Optional(<UIView: 0x7fc5c41057a0; frame = (0 0; 0 0); layer = <CALayer: 0x60000213d780>>)
nil nil
In this case, the views array doesn't contain references to the objects, but instead it stores KeyPaths which you can think of as directions how to find them. That allows you to access the objects and set them to nil if desired.
As MartinR said in this comment, it's not possible to directly modify the address of an instance variable. Some possible solutions for this problem:
As MartinR points out, you can store the views only in that array and nowhere else. That way, when you set every element in the array to nil, there won't be any remaining references to the views. However, this can easily produce overly long or hard-to-understand code.
A usually better solution for UIViews is to create a custom container view and put into it all the elements you'd otherwise store in an array. That way, you can simply set that one view to nil, and all of its subviews will be released along with it (since the superview holds the references to them).
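The first alternative (letting the array hold the only strong references) can be sketched with a plain class standing in for UIView; the liveCount bookkeeping exists only to make the deallocation observable:

```swift
final class Box {
    static var liveCount = 0
    init() { Box.liveCount += 1 }
    deinit { Box.liveCount -= 1 }
}

// The array is the only owner; there are no separate
// firstView/secondView variables to go stale.
var boxes: [Box?] = [Box(), Box()]
print(Box.liveCount) // 2

// Nil-ing each element drops the last strong reference,
// so ARC deallocates the objects immediately.
for i in boxes.indices {
    boxes[i] = nil
}
print(Box.liveCount) // 0
```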
I've set up an if statement which does things depending on an int value stored in a UILabel; however, my code can't determine the value of the int inside the label.
This is because the if statement runs within the viewDidLoad function. I'm basically looking for how I can pass the label's value to my view, and I'm struggling to make it work.
Here's the code:
override func viewDidLoad()
{
// If statement
var NumberRead = Int(Number.text!)
if NumberRead! <= 2 {
Picture.image = Pic1
} else {
Picture.image = Pic2
}
}
#IBOutlet weak var Number: UILabel!
Any suggestions for better ways to handle this would be amazing.
Thanks
I don't think the problem is that you run this code in viewDidLoad, but that your UILabel is empty when the code runs, so the code is force-unwrapping a nil value. You have to set the UILabel Number to a value before you use the if statement.
I don't know what value you expect in the Number label, but either set it to an initial value, e.g. 0, or, when this view controller is reached by segueing from another view controller, pass along the value the label should have. The code should then work fine.
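The failure mode described above is easy to reproduce in isolation: Int of an empty or non-numeric string is nil, so force-unwrapping it traps. A minimal sketch (emptyText stands in for the label's unset text):

```swift
let emptyText = ""          // what an unset label's text amounts to
let parsed = Int(emptyText) // nil: "" is not a number

// Force-unwrapping 'parsed' here would crash at runtime.
// Optional binding handles the not-yet-set case safely:
if let number = parsed, number <= 2 {
    print("show Pic1")
} else {
    print("show Pic2 (or no valid number yet)")
}
```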
How about:
set the default image to Pic2, then use optional binding to check the value:
Picture.image = Pic2
if let numTxt = Number.text {
if let num = Int(numTxt) {
if num <= 2 {
Picture.image = Pic1
}
}
}
You should separate your data and logic from the view. It is absurd to put a value into a label and then try to read it back and interpret it. Please consider reading up on the MVC (model-view-controller) pattern, which is the underpinning of iOS.
Your view controller should have a variable, say number to keep track of the value that determines which image to show.
var number: Int = 0 {
didSet {
imageView.image = number <= 2 ? pic1 : pic2
label.text = "\(number)"
}
}
On naming
Your variable names are also a mess. By convention, variables are lowerCamelCase. If you are naming a UIImageView, the name picture is too generic and could be misleading. Also avoid names like pic1 and pic2; something like activeImage and inactiveImage is better.
I want to be able to make two variables available to the entire SKScene and all functions inside of it. One of these variables uses the other to create its value. I understand why I cannot do this, but I don't know a fix for it. I have this code:
class GameScene: SKScene {
let num : CGFloat = 1.25
let reciprocal = 1 / num // <— This Line
override func sceneDidLoad() {
}
override func update(_ currentTime: TimeInterval) {
// Called before each frame is rendered
}
}
But I am obviously getting an error on the marked line.
Cannot use instance member 'num' within property initializer; property
initializers run before 'self' is available
This means that I cannot use the variable because it belongs to the SKScene instance, and the instance hasn't been fully initialized yet. Is there a way to declare this variable without throwing an error while keeping it accessible everywhere within this class?
Since reciprocal depends directly upon num, it could make sense to let the former be a computed property based on the latter:
class GameScene: SKScene {
let num: CGFloat = 1.5
var reciprocal: CGFloat { return 1/self.num }
// ...
}
Since num is an immutable property and will never change at runtime, another alternative is to let reciprocal be a lazy variable, computed upon its first use
class GameScene: SKScene {
let num: CGFloat = 1.5
lazy var reciprocal: CGFloat = { return 1/self.num }()
// ...
}
(Or, implement your own custom initializer for the GameScene, where you can initialize num and reciprocal to e.g. a given value and its reciprocal, respectively).
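That last, custom-initializer option might look like the following sketch; a plain class named Model stands in for GameScene here, since SKScene's required initializers would clutter the example:

```swift
import Foundation // for CGFloat

class Model {
    let num: CGFloat
    let reciprocal: CGFloat

    init(num: CGFloat) {
        // Both stored properties are assigned inside the initializer,
        // so neither needs to reference 'self' in a property initializer.
        self.num = num
        self.reciprocal = 1 / num
    }
}

let model = Model(num: 1.25)
print(model.reciprocal) // 0.8
```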
The last line of buildPalette() crashes with EXC_BAD_INSTRUCTION (code=EXC_I386_IVOP, sub code = 0x0). It's ok when I use a different type, like Int instead of CGColor.
Curiously, I can use a local variable for the array and fill it with colours, but then it crashes with the same message when I try to return it from buildPalette().
#objc class Palette {
var palette: [CGColor]
init() {
palette = [CGColor]()
buildPalette()
}
func buildPalette() {
let rgb = CGColorSpaceCreateDeviceRGB()
// omitting the loop for simplicity
let color = CGColorCreate(rgb, [1.0, 1.0, 1.0, 1.0])
palette.append(color) // crashes here
}
}
EDIT: As Maciej Trybiło has commented, this issue is fixed in Xcode Beta 5.
You are using CGColor in the array, so it cannot be properly converted (bridged to Objective-C) inside a Swift array. CGColor is not directly convertible to AnyObject within an array, which causes a runtime error when working with Objective-C classes. You need to declare your palette variable as [AnyObject] for proper conversion between Core Graphics and Swift, or you can use UIColor instead.
From the Swift docs:
When you bridge from a Swift array to an NSArray object, the
elements in the Swift array must be AnyObject compatible. For example,
a Swift array of type Int[] contains Int structure elements. The Int
type is not an instance of a class, but because the Int type bridges
to the NSNumber class, the Int type is AnyObject compatible.
Therefore, you can bridge a Swift array of type Int[] to an NSArray
object. If an element in a Swift array is not AnyObject compatible, a
runtime error occurs when you bridge to an NSArray object.
You can also create an NSArray object directly from a Swift array literal, following the same bridging rules outlined above. When you
explicitly type a constant or variable as an NSArray object and assign
it an array literal, Swift creates an NSArray object instead of a
Swift array.
So you need to make the array's elements AnyObject compatible. The code below works fine:
#objc class Palette {
var palette: [AnyObject]
init() {
palette = [AnyObject]()
buildPalette()
}
func buildPalette() {
let rgb = CGColorSpaceCreateDeviceRGB()
// omitting the loop for simplicity
let color:CGColorRef = CGColorCreate(rgb, [1.0, 1.0, 1.0, 1.0])
palette.append(color) // crashes here
}
}
The method above works with CGColor, but you can use UIColor instead of CGColor and convert an existing CGColor with UIColor(CGColor:).