Binding Swift properties to NSTableView?

I think I have programmed myself into a corner, but I'm hoping you all know a way out. I have a class...
class Card {
    var order: Int? = -1
    var tag: String = "0"
    var comment: String?
    var data: [String: NSNumber]
}
Ideally everything would live in data, which is a few strings and lots of numbers. I started with [String: String], but I found I was writing lots of code to cast and convert whenever I wanted to (say) compare one of those numbers to zero. Changing it to [String: NSNumber] simplified all that code, but now my tableViewDataSource becomes very complex because some of the data is in data and some is in a separate property like comment. I even tried [String: Any], but then everything had to be cast all the time to do anything.
I have a feeling I am missing something fundamental here. When working with NSTableView, is there a simple way to use Swift properties that I'm missing? valueForKey: does not work, I know of no easy reflection-like solution, and so on. Any suggestions?

You can only bind dynamic properties, and your class needs to inherit from NSObject or implement NSObjectProtocol. Additionally, optional value types aren't allowed, so you cannot bind an Int? property.
i.e.:
class Card: NSObject {
    dynamic var order: Int = -1
    dynamic var tag: String = "0"
    dynamic var comment: String?
    dynamic var data: [String: NSNumber] = [:]   // needs a default value (or an init) to compile
}
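Note that in Swift 4 and later, dynamic alone is no longer enough: Key-Value Observing and bindings also require the property to be exposed to Objective-C with @objc. A minimal sketch of wiring this up in code (the array controller, column identifier, and key path here are illustrative assumptions, not from the original question):

import Cocoa

class Card: NSObject {
    @objc dynamic var order: Int = -1          // @objc dynamic is required in Swift 4+
    @objc dynamic var tag: String = "0"
    @objc dynamic var comment: String?
    @objc dynamic var data: [String: NSNumber] = [:]
}

// Hypothetical setup: expose the cards through an NSArrayController
// and bind a table column's value to one of the key paths.
let cardsController = NSArrayController(content: [Card()])
let orderColumn = NSTableColumn(identifier: NSUserInterfaceItemIdentifier("order"))
orderColumn.bind(.value,
                 to: cardsController,
                 withKeyPath: "arrangedObjects.order",
                 options: nil)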

Related

Swift initialise property at struct initialisation without initialiser

In Swift, structs have an automatically generated memberwise initializer.
This means the following struct can be initialised without me having to write an init:
struct Activity {
    let name: String
    let desc: String
    let category: Category
    let subcategory: Subcategory
    let emoji: Character
    let coordinate: CLLocationCoordinate2D
    let creationTime: Date = Date()
    let activityTime: Date
    let id: UUID = UUID()
    var comments: [Comment] = []
}
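For reference, the generated memberwise initialiser only takes parameters for the stored properties that still need a value: creationTime and id are constants that already have defaults, so they are skipped, and (from Swift 5.1) comments gets a default argument. A hypothetical call, with placeholder category and subcategory values, might look like:

import CoreLocation

let hike = Activity(name: "Hike",
                    desc: "Morning hike",
                    category: someCategory,         // placeholder
                    subcategory: someSubcategory,   // placeholder
                    emoji: "🥾",
                    coordinate: CLLocationCoordinate2D(latitude: 47.6, longitude: -122.3),
                    activityTime: Date())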
I have one single property, emoji, which is computed from the subcategory. In other words, the value of emoji depends on the value of subcategory.
However this means that the value of emoji can only be assigned after the initialisation of subcategory.
How should I do this in code?
Approach 1:
Provide my own initialiser
init(name: String, desc: String, category: Category, subcategory: Subcategory,
     coordinate: CLLocationCoordinate2D, activityTime: Date) {
    self.name = name
    self.desc = desc
    self.category = category
    self.subcategory = subcategory
    self.coordinate = coordinate
    self.activityTime = activityTime
    self.emoji = AllCategories.categories[category]?[subcategory] ?? "❌"
}
I don't like this approach, as it adds a lot of unnecessary code that will only grow if I add more properties... I would like to use the generated initialiser of the struct. On the other hand, the code is still very simple.
Approach 2:
Use a lazy var that is only computed when it is first accessed.
lazy var emoji: Character = {
    AllCategories.categories[category]?[subcategory] ?? "❌"
}()
I also don't really like this approach; I find it overly complex for what I am trying to do. It also makes emoji a var instead of a let, which it should not be: I want it to remain a constant. On the other hand, I can continue using the automatically generated initialiser.
Questions:
What other possibilities do I have?
If there are none, which of the 2 approaches is the best?
This sounds like a great chance to use computed properties:
var emoji: Character {
    AllCategories.categories[category]?[subcategory] ?? "❌"
}
Although it is declared a var, you can't actually set it. It's just how computed properties must be declared.
The expression AllCategories.categories[category]?[subcategory] ?? "❌" will be evaluated every time you read the property. It's not a particularly time-consuming expression, so IMO that's fine.
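Folding that into the struct from the question, emoji is no longer a stored property, so it also disappears from the memberwise initialiser (a sketch):

struct Activity {
    let name: String
    let desc: String
    let category: Category
    let subcategory: Subcategory
    let coordinate: CLLocationCoordinate2D
    let creationTime: Date = Date()
    let activityTime: Date
    let id: UUID = UUID()
    var comments: [Comment] = []

    // Computed on every read; never stored and never an init parameter
    var emoji: Character {
        AllCategories.categories[category]?[subcategory] ?? "❌"
    }
}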

initialise Codable result with var

I need to store the result of JSONDecoder in a var defined outside the API call.
apiService.GETAPI(url: apiStr, completion: { (success, result) in
    if success {
        let apiResponse = result.value as! NSDictionary
        let data = apiResponse.value(forKey: "data") as! NSDictionary
        do {
            let profileData = try JSONSerialization.data(withJSONObject: data.value(forKey: "profile"), options: .prettyPrinted)
            print(profileData)
            let profile = try JSONDecoder().decode(Profile.self, from: profileData)
            print(profile.name)
        }
        catch {
            print("json error: \(error.localizedDescription)")
        }
    }
    completion(success)
})
But I am unable to do so. This is my Profile Codable struct
struct Profile: Codable {
    var id: Int
    var name: String
    var member_id: Int
    var category_id: Int
    var membership_id: Int
    var information: String
    var city: String
    var love_count: Int
    var vendor_price: String
    var locality_name: String
    var phone: [String]
    var address: [Address]?
    var status: Int?
    var managed_by_wmg: Int?
}
How do I do this? It needs to be a var because I have to perform operations on it and access it later elsewhere in the code.
As we have already discussed in the comments on your question, you need to declare a variable of type Profile outside the closure.
So now the problem becomes: "How do I change the Profile struct so that I can declare such a variable?"
For this you have several options. I will list them and comment on which to choose and why.
Option 1
Add default values to all the properties of the Profile struct, e.g. var id: Int = 0, so that you can simply declare var profile = Profile(). With this solution you need some knowledge about the objects you are expecting and which default values make sense for your business logic, for example "" for a string or 0 for an integer. This might seem good enough, but there is a better option.
Option 2
Make ALL the properties of Profile optional, e.g. var id: Int?. This might sound strange at first, but it is usually the best fit when working with server data. This method has several benefits:
It will never crash, no matter what data the server sends you
All the properties get a default value of nil
You do not need to think about your business logic and what default value suits your needs
However, this method has one drawback: some properties that by your business logic could never be nil will still start out as nil, so you need to validate and unwrap those optional properties.
Option 3
Add implicit unwrapping to the property type, like so: var id: Int!. Note that this also gives your properties an initial value of nil, but it tells the compiler that whenever you use them they will not be nil. I would personally not recommend this method, because if the value is not received from the server, or the parsing fails, or something else unexpected happens, your app will crash with "unexpectedly found nil while unwrapping an Optional value". For reference, you may have noticed that all IBOutlets are defined this way.
Having said all that, I would recommend going with Option 2, which best suits server communication, but the choice is yours!
Happy coding.
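A minimal sketch of Option 2, with the variable declared outside the closure (the handleResponse helper and the reduced field list are illustrative, not from the question):

import Foundation

struct Profile: Codable {
    var id: Int?
    var name: String?
    var member_id: Int?
    // ... the remaining fields, likewise optional
}

var profile: Profile?                  // lives outside the API call

func handleResponse(_ profileData: Data) {
    // Keys that are missing from the JSON simply leave the matching fields nil
    profile = try? JSONDecoder().decode(Profile.self, from: profileData)
}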
I have found two solutions to my question. Thanks to @dvp.petrov.
Solution 1: var profile: Profile! comes in handy, but it becomes awkward when the value is used in many places, because you end up adding lots of checks and unwrapping it every time. This is fine when it is only used in a few places.
Solution 2: Create an init that gives default values to all the properties; then you can write var profile = Profile(). This way the object is never nil and can be used everywhere.
struct Profile: Codable {
    var id: Int
    var name: String
    var member_id: Int

    init() {
        self.id = 0
        self.name = ""
        self.member_id = 0
    }
}

Different ways to initialize a dictionary in Swift?

As far as I know, there are 4 ways to declare a dictionary in Swift:
var dict1: Dictionary<String, Double> = [:]
var dict2 = Dictionary<String, Double>()
var dict3: [String:Double] = [:]
var dict4 = [String:Double]()
It seems these four options yield the same result.
What's the difference between these?
All you're doing is noticing that you can:
Use explicit variable typing, or let Swift infer the type of the variable based on the value assigned to it.
Use the formal specified generic struct notation Dictionary<String,Double>, or use the built-in "syntactic sugar" for describing a dictionary type [String:Double].
Two times two is four.
And then there are in fact some possibilities you've omitted; for example, you could say
var dict5 : [String:Double] = [String:Double]()
And of course in real life you are liable to do none of these things, but just assign an actual dictionary to your variable:
var dict6 = ["howdy":1.0]
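If you want to convince yourself that every spelling above produces the same type, a quick check (a small sketch) is:

// Each declaration is a Dictionary<String, Double>
print(type(of: dict1))                      // Dictionary<String, Double>
print(type(of: dict1) == type(of: dict6))   // true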

Why does Swift BooleanLiteralConvertible require a boolean literal?

I am trying to add BooleanLiteralConvertible support to my class so I can instantiate it with a boolean. The thing that's throwing me for a loop is the distinction between a boolean value and a boolean literal.
For example, after adding the protocol I attempted this:
func setSelected(value: Bool) {
    var node: MyClass = value
}
But Swift complained that it cannot convert Bool to MyClass. It took me a while to realize it has to be a boolean literal. Oddly enough the following works fine:
func setSelected(value: Bool) {
    var node: MyClass = value ? true : false
}
…which seems just absolutely silly to me. Is there a legitimate reason for this seemingly very bizarre requirement?
Types conforming to BooleanLiteralConvertible can be initialized with the Boolean literals true and false, e.g.
let mc : MyClass = true
This has nothing to do with initializing the type with a Boolean value:
let value : Bool = // ... some boolean value
let mc : MyClass = value // error: cannot convert value of type 'Bool' to specified type 'MyClass'
and there is – as far as I know – no way to make such an implicit
conversion work. You would have to write a custom init method
init(bool: Bool) {
    // ...
}
and initialize the object as
let value : Bool = // ... some boolean value
let mc = MyClass(bool: value)
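For completeness, a sketch of what the conformance itself might look like (the class body and property name are assumptions; BooleanLiteralConvertible was renamed ExpressibleByBooleanLiteral in Swift 3):

class MyClass: BooleanLiteralConvertible {
    var isSelected: Bool

    // Called only when the *literal* true or false appears in source
    required init(booleanLiteral value: Bool) {
        self.isSelected = value
    }

    // Explicit initializer for runtime Bool values
    init(bool: Bool) {
        self.isSelected = bool
    }
}

let a: MyClass = true          // literal: uses init(booleanLiteral:)
let flag: Bool = false
let b = MyClass(bool: flag)    // variable: needs the explicit initializer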
I like the question. Only the Swift team could definitively answer, but I can speculate as to why: converting a typed value into a variable of a different type without an explicit conversion or cast is very easy to confuse with a programmer error, and in many cases is something the compiler should warn about.
Example (and assume that Person is also a StringLiteralConvertible that can be initialized with a string variable as well as a literal as you pose in your question):
struct Person {
    private static var idCounter = 1
    var name: String
    let id: Int

    init(withName name: String) {
        Person.idCounter += 1
        self.name = name
        self.id = Person.idCounter
    }
}
var person = Person(withName:"Mary")
let name = "John"
person = name
The above code looks suspiciously like a mistake, where the programmer is assigning a value of the wrong type (String) to a variable of type Person. It may in fact be a mistake. Maybe the programmer only meant to change the name of the person (person.name = name) without creating a new Person with a new unique id. Or maybe the programmer intended to assign some other value to person but made a typo or a code-completion error. It's hard to tell without either being the original programmer or carefully studying all the context to see whether this conversion makes sense, and it gets harder the further the assignment is from the place where the variables were originally initialized. Should the compiler warn here that a value of type String is being assigned to a variable of type Person?
The example would be far more clear, and more in line with Swift conventions as:
var person = Person(withName:"Mary")
let name = "John"
person = Person(withName:name)
The above version is completely unambiguous, both to the compiler and to any other programmers who read this later.

Defining Nested Dictionaries in Swift

In the code below, I get the error "[String : Double] does not conform to Hashable". How do I get around this?
I see the problem of non-conformance to the Hashable protocol, but I'm wondering why this is the case when the other way around works. Is only the Key of a dictionary required to conform to Hashable? Some explanation would help.
enum someEnumType {
    case First(String, (Int, Int) -> Int)
    case Second(String, Int)
}

// var operations = [someEnumType : [String : Double]]()   // <-- this syntax works
var operations = [[String : Double] : someEnumType]()      // <-- but this does not; ideally I want this
Dictionaries are also called† hash tables; they work by hashing the key. So, yes, it does need to be Hashable. The value doesn't since the point is to look up values by key.
† Well, strictly speaking one could implement a dictionary without hashing, but in practice a data structure called a dictionary in programming languages is usually understood to be a hash map. In Swift as well, the Dictionary documentation specifies it as a “hash-based mapping”.
You are correct, only a Dictionary's key must conform to the Hashable protocol.
"How do I get around this?"
Probably the most direct way of using a Dictionary the way you want to is to define your own key type. A struct makes sense here; it offers the same value semantics that your [String: Double] would-be key offers, and it is easy to define:
struct MyKey {
    let myString: String
    let myDouble: Double
}
Of course, it must be hashable to be used as a Dictionary key, so we add Hashable conformance:
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double

    var hashValue: Int {
        return self.myString.hashValue ^ self.myDouble.hashValue
    }
}
I did a cute trick there to calculate a "unique-ish" hash value for this key type: just an XOR of the string's and double's hash values. I won't guarantee uniqueness, but it should be good enough for most cases. Calculating a better hash value is an exercise I'll leave up to you if you want (but it will work just fine like this).
Finally, to conform to Hashable, one must also conform to Equatable. In our case we'll just check to see if each of a key's properties match the other's to determine if keys are equal. The full implementation:
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double

    var hashValue: Int {
        return self.myString.hashValue ^ self.myDouble.hashValue
    }
}

func ==(lhs: MyKey, rhs: MyKey) -> Bool {
    return lhs.myString == rhs.myString && lhs.myDouble == rhs.myDouble
}
Now, you may define your Dictionary like this:
var operations = [MyKey: someEnumType]()
And add an entry like this:
let myFirstKey = MyKey(myString: "Hello", myDouble: 1.0)
operations[myFirstKey] = .Second("two", 2)   // any case of someEnumType
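Worth noting (not part of the original answer): since Swift 4.1 the compiler synthesises Hashable and Equatable for structs whose stored properties are themselves Hashable, so on a modern toolchain the manual hashValue and == above can be dropped entirely:

// Swift 4.1+: conformance is synthesised from the stored properties
struct MyKey: Hashable {
    let myString: String
    let myDouble: Double
}

var operations = [MyKey: someEnumType]()   // works exactly as before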
Yes, the key needs to be hashable.
You should be able to work around this by using NSString rather than String.