I've tried to add the Group {} trick to get more than 10 elements in a Table view, but it fails to compile, just as it does when there are more than 10 elements without the group.
var body: some View {
Table(viewModel.tableArrayOfStructs, sortOrder: $sortOrder) {
Group {
TableColumn("One", value: \.One).width(min: 35, ideal: 35, max: 60)
TableColumn("Two", value: \.Two).width(30)
TableColumn("Three", value: \.Three).width(50)
TableColumn("Four", value: \.Four).width(min: 150, ideal: 200, max: nil)
TableColumn("Five", value: \.Five).width(50)
TableColumn("Six", value: \.Six).width(min: 50, ideal: 55, max: nil)
TableColumn("Seven", value: \.Seven).width(88)
TableColumn("Eight", value: \.Eight).width(88)
TableColumn("Nine", value: \.Nine).width(20)
TableColumn("Ten", value: \.Ten).width(50)
}
TableColumn("Eleven", value: \.Eleven).width(50)
}
}
If I also put the eleventh and subsequent columns into a new group, I get the same issue. The compiler reports:
The compiler is unable to type-check this expression in reasonable time; try breaking up the expression into distinct sub-expressions
Is there any means to have a table with more than 10 columns short of dropping down to NSViewRepresentable?
Apple Developer Technical Support was able to help. They recommended being explicit with the value: type information in the TableColumn to give the compiler some help; without explicit type info, with or without the group, the compiler simply gives up. I also found that once you add one group, all TableColumns need to be inside groups (with 10 or fewer TableColumns per group).
var body: some View {
Table(viewModel.tableArrayOfStructs, sortOrder: $sortOrder) {
Group {
TableColumn("One", value: \ViewModel.TableArrayOfStructs.One).width(min: 35, ideal: 35, max: 60)
TableColumn("Two", value: \ViewModel.TableArrayOfStructs.Two).width(30)
TableColumn("Three", value: \ViewModel.TableArrayOfStructs.Three).width(50)
TableColumn("Four", value: \ViewModel.TableArrayOfStructs.Four).width(min: 150, ideal: 200, max: nil)
TableColumn("Five", value: \ViewModel.TableArrayOfStructs.Five).width(50)
TableColumn("Six", value: \ViewModel.TableArrayOfStructs.Six).width(min: 50, ideal: 55, max: nil)
TableColumn("Seven", value: \ViewModel.TableArrayOfStructs.Seven).width(88)
TableColumn("Eight", value: \ViewModel.TableArrayOfStructs.Eight).width(88)
TableColumn("Nine", value: \ViewModel.TableArrayOfStructs.Nine).width(20)
TableColumn("Ten", value: \ViewModel.TableArrayOfStructs.Ten).width(50)
}
Group {
TableColumn("Eleven", value: \ViewModel.TableArrayOfStructs.Eleven).width(50)
}
}
}
Thank you very much for the tip of using Group. I was not sure it was going to work on TableColumn the same way as with other Views, but it does.
In my case, it was not compulsory to enclose all table columns within groups.
I also received the tip from Apple Developer Technical Support to be explicit about the value type: \ViewModel.property instead of .property.
We can also use a computed property to generate a TableColumn, like this:
var firstNameColumn: TableColumn<ViewModel, KeyPathComparator<ViewModel>, Text, Text> {
TableColumn("Firstname", value: \ViewModel.firstname)
}
The first Text refers to \ViewModel.firstname, which automatically generates a Text view, while the second Text is for the label "Firstname".
In some cases, when the type cannot be compared, you can replace KeyPathComparator with Never.
For instance, for a date:
var dateColumn: TableColumn<ViewModel, Never, DateView, Text> {
TableColumn("Date") { doc in
DateView(date: doc.date)
}
}
Where DateView is a custom View.
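For completeness, a minimal DateView might look like the sketch below. This is hypothetical (the original custom view wasn't shown), so the name and formatting choices are assumptions:

```swift
import SwiftUI

// Hypothetical minimal version of the custom DateView mentioned above.
struct DateView: View {
    let date: Date

    // A shared formatter avoids re-creating one on every body evaluation.
    private static let formatter: DateFormatter = {
        let f = DateFormatter()
        f.dateStyle = .medium
        return f
    }()

    var body: some View {
        Text(Self.formatter.string(from: date))
    }
}
```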
Hope that helps others
Regards
Vincent
An example as requested in the comment (untested).
var body: some View {
Table(viewModel.tableArrayOfStructs, sortOrder: $sortOrder) {
Group {
self.foo //since foo has the @ViewBuilder annotation, these will just be “pasted” in here.
}
self.foobar //not a @ViewBuilder, so it returns a specific view
self.bar(count: …)
}
}
@ViewBuilder //not always necessary but usually very handy, depending on what you return
var foo: some View {
TableColumn("One", value: \.One).width(min: 35, ideal: 35, max: 60)
TableColumn("Two", value: \.Two).width(30)
TableColumn("Three", value: \.Three).width(50)
TableColumn("Four", value: \.Four).width(min: 150, ideal: 200, max: nil)
TableColumn("Five", value: \.Five).width(50)
TableColumn("Six", value: \.Six).width(min: 50, ideal: 55, max: nil)
TableColumn("Seven", value: \.Seven).width(88)
TableColumn("Eight", value: \.Eight).width(88)
TableColumn("Nine", value: \.Nine).width(20)
TableColumn("Ten", value: \.Ten).width(50)
}
var foobar: some View {
Group {
TableColumn("One", value: \.One).width(min: 35, ideal: 35, max: 60)
TableColumn("Two", value: \.Two).width(30)
TableColumn("Three", value: \.Three).width(50)
}
}
func bar(count: Int) -> some View
{
//as with foo, return your sub view
//annotate with #ViewBuilder if necessary
}
Point is, you don't have to construct your entire view in the view's body. Depending on programming style, code organisation, semantics, …, split up your code into other child view structs or (private) vars and funcs within a view struct.
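As a small sketch of the child-view-struct idea (all names here are made up for illustration):

```swift
import SwiftUI

// A child view extracted into its own struct (hypothetical names).
struct HeaderView: View {
    let title: String

    var body: some View {
        Text(title)
            .font(.headline)
    }
}

// The parent view simply composes the child.
struct ContentView: View {
    var body: some View {
        VStack {
            HeaderView(title: "Report")
            Text("Body content")
        }
    }
}
```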
Related
I am a novice at programming and exploring SwiftUI. I've been tackling a challenge for too long, and hoping that someone can guide me to the right direction!
I want a list of interlinked sliders (as in Interlinked Multiple Sliders in SwiftUI), but with the number of sliders that change dynamically, depending on actions taken by a user.
For example, a user can choose various items, and later on adjust the percentage variable with sliders (and where these percentages are interdependent as in the linked example).
class Items: ObservableObject {
@Published var components = [ItemComponent]()
func add(component: ItemComponent) {
components.append(component)
}
}
struct ItemComponent: Hashable, Equatable, Identifiable {
var id = UUID().uuidString
var name: String = ""
var percentage: Double
}
Conceptually, it seems I need to do two things to adapt the linked code:
generate an array of Bindings with the number of elements equal to Items.components.endIndex, and
assign each Binding to the percentage of each ItemComponent.
I am fumbling on both. For 1., I can easily manually create any number of variables, e.g.
@State var value1 = 100
@State var value2 = 100
@State var value3 = 100
let allBindings = [$value1, $value2, $value3]
but how do I generate them automatically?
For 2., I can use ForEach() to iterate over the components, or over the indices, but not both together:
ForEach(Items.components){ component in
Text("\(component.name)")
Text("\(component.percentage)")
}
ForEach(Items.components.indices){ i in
synchronizedSlider(from: allBindings, index: i+1)
}
In broken code, what I want is something like:
ForEach(Items.component){component in
HStack{
Text("component.name")
Spacer()
synchronizedSlider(from: allBindings[$component.percentage], index: component.indexPosition)
}
where allBindings[$component.percentage] is a binding array comprised of each itemComponent's percentage, and the index is an itemComponent's index.
I am happy to share more code if relevant. Any help would be greatly appreciated!
To adapt the existing code you linked, if you're going to have a dynamic number of sliders, you'll definitely want your @State to be an array, rather than individual @State variables, which would have to be hard-coded.
Once you have that, there are some minor syntax issues changing the synchronizedBinding functions to accept Binding<[ItemComponent]> rather than [Binding<Double>], but they are pretty minor. Luckily, the existing code is pretty robust outside of the initial hard-coded states, so there isn't any additional math to do with the calculations.
I'm using ItemComponent rather than just Double because your sample code included it and having a model with a unique id makes the ForEach code I'm using for the sliders easier to deal with, since it expects uniquely-identifiable items.
struct ItemComponent: Hashable, Equatable, Identifiable {
var id = UUID().uuidString
var name: String = ""
var percentage: Double
}
struct Sliders: View {
@State var values : [ItemComponent] = [.init(name: "First", percentage: 100.0), .init(name: "Second", percentage: 0.0), .init(name: "Third", percentage: 0.0), .init(name: "Fourth", percentage: 0.0)]
var body: some View {
VStack {
Button(action: {
// Manually setting the values does not change the values such
// that they sum to 100. Use separate algorithm for this
self.values[0].percentage = 40
self.values[1].percentage = 60
}) {
Text("Test")
}
Button(action: {
self.values.append(ItemComponent(percentage: 0.0))
}) {
Text("Add slider")
}
Divider()
ScrollView {
ForEach(Array(values.enumerated()),id: \.1.id) { (index,value) in
Text(value.name)
Text("\(value.percentage)")
synchronizedSlider(from: $values, index: index)
}
}
}.padding()
}
func synchronizedSlider(from bindings: Binding<[ItemComponent]>, index: Int) -> some View {
return Slider(value: synchronizedBinding(from: bindings, index: index),
in: 0...100)
}
func synchronizedBinding(from bindings: Binding<[ItemComponent]>, index: Int) -> Binding<Double> {
return Binding(get: {
return bindings[index].wrappedValue.percentage
}, set: { newValue in
let sum = bindings.wrappedValue.indices.lazy.filter{ $0 != index }.map{ bindings[$0].wrappedValue.percentage }.reduce(0.0, +)
// Use the 'sum' below if you initially provide values which sum to 100
// and if you do not set the state in code (e.g. click the button)
//let sum = 100.0 - bindings[index].wrappedValue
let remaining = 100.0 - newValue
if sum != 0.0 {
for i in bindings.wrappedValue.indices {
if i != index {
bindings.wrappedValue[i].percentage = bindings.wrappedValue[i].percentage * remaining / sum
}
}
} else {
// handle 0 sum
let newOtherValue = remaining / Double(bindings.wrappedValue.count - 1)
for i in bindings.wrappedValue.indices {
if i != index {
bindings[i].wrappedValue.percentage = newOtherValue
}
}
}
bindings[index].wrappedValue.percentage = newValue
})
}
}
I extended my custom type with a mutating function and then call it on an instance of the corresponding type. But the instance appears changed only on that line; on the next line it has the previous value. Why don't the changes made to the instance persist?
If I assign the result of the mutating call to a var/let, the result is saved. Or if I call harderWorkout() inside the print() statement, it prints the changed value.
struct Workout {
var distance: Double
var time: Double
var averageHR: Int
}
extension Workout: CustomStringConvertible {
var description: String {
return "Workout(distance: \(distance), time: \(time), averageHR: \(averageHR)"
}
}
extension Workout {
mutating func harderWorkout() -> Workout {
return Workout(distance: (self.distance * 2), time: (self.time * 2), averageHR: (self.averageHR + 40))
}
}
var workout = Workout(distance: 500, time: 50, averageHR: 100)
print(workout) //Workout(distance: 500.0, time: 50.0, averageHR: 100, Speed: 10.0
workout.harderWorkout()
print(workout) //Workout(distance: 500.0, time: 50.0, averageHR: 100, Speed: 10.0
In the last print I expected to see Workout(distance: 1000.0, time: 100.0, averageHR: 140, but it's not clear to me why the harderWorkout() method doesn't change the workout instance. Maybe it's because of the value type, but I used the mutating keyword...
Will be very thankful if someone explains to me the reason and its mechanism.
Instead of returning Workout instance from harderWorkout() method, assign the new Workout instance to self, i.e.
extension Workout {
mutating func harderWorkout() {
self = Workout(distance: (self.distance * 2), time: (self.time * 2), averageHR: (self.averageHR + 40))
}
}
Alternatively, you can simply change the distance, time and averageHR values of the same instance, i.e.
extension Workout {
mutating func harderWorkout() {
self.distance *= 2
self.time *= 2
self.averageHR += 40
}
}
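With either mutating version above, the call site from the question now behaves as expected. A minimal self-contained check (using the second variant):

```swift
struct Workout {
    var distance: Double
    var time: Double
    var averageHR: Int
}

extension Workout {
    // Mutates the instance in place instead of returning a new value.
    mutating func harderWorkout() {
        distance *= 2
        time *= 2
        averageHR += 40
    }
}

var workout = Workout(distance: 500, time: 50, averageHR: 100)
workout.harderWorkout()
print(workout.distance)  // 1000.0
print(workout.averageHR) // 140
```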
It is pretty simple: in your harderWorkout() you create a new Workout and return that, instead of mutating self.
If you expect it to mutate, you will need to do the following:
extension Workout {
mutating func harderWorkout() -> Workout {
self.distance *= 2
self.time *= 2
self.averageHR += 40
return self
}
}
You can see that it now returns self, and maybe the method doesn't need to return anything at all if you just want it to mutate.
I'd like to have a variable which can represent a single Measurement but may have different unit types. For example, it could store a length or a mass. It seems so simple, but I can't figure it out.
Here's what I tried:
struct Data {
var weight: Measurement<UnitMass>
var length: Measurement<UnitLength>
var target = "Weight"
var valueOfTarget: Measurement<Unit> {
if target == "Weight" {
return weight
} else {
return length
}
}
}
var data = Data(weight: Measurement<UnitMass>(value: 10, unit: UnitMass.kilograms),
length: Measurement<UnitLength>(value: 10, unit: UnitLength.centimeters),
target: "Weight")
print(data.valueOfTarget)
I also tried using <Dimension> as suggested in another answer, but that had a similar error.
This results in a compiler error:
error: cannot convert return expression of type 'Measurement<UnitMass>' to return type 'Measurement<Unit>'
Am I missing something obvious, or is this just not possible?
You could just create new generic return values. This seems to compile OK for me.
struct Data {
var weight: Measurement<UnitMass>
var length: Measurement<UnitLength>
var target = "Weight"
var valueOfTarget: Measurement<Unit> {
if target == "Weight" {
return Measurement<Unit>(value: weight.value, unit: weight.unit)
} else {
return Measurement<Unit>(value: length.value, unit: length.unit)
}
}
}
var data = Data(weight: Measurement<UnitMass>(value: 10, unit: UnitMass.kilograms),
length: Measurement<UnitLength>(value: 10, unit: UnitLength.centimeters),
target: "Weight")
print(data.valueOfTarget)
First, don't make your own struct named Data, because Foundation already has a type named Data that's pretty widely used.
Second, since (in a comment) you said “It's for weight loss, the user can choose a target as either a weight or a waist size”, it seems like you should probably model this using an enum rather than a struct:
enum WeightLossTarget {
case weight(Measurement<UnitMass>)
case waistSize(Measurement<UnitLength>)
}
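Reading the value back out of the enum is then just a switch over the cases; a quick sketch (the example values are made up):

```swift
import Foundation

enum WeightLossTarget {
    case weight(Measurement<UnitMass>)
    case waistSize(Measurement<UnitLength>)
}

let target = WeightLossTarget.weight(Measurement(value: 80, unit: .kilograms))

// Each case carries its own strongly typed Measurement.
switch target {
case .weight(let w):
    print("Target weight: \(w.value) \(w.unit.symbol)")  // 80.0 kg
case .waistSize(let l):
    print("Target waist: \(l.value) \(l.unit.symbol)")
}
```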
Third, if you really need to use a struct, you can fall back to the Objective-C type NSMeasurement as a non-generic type:
struct WeightLossTarget {
enum TargetType {
case weight
case waistSize
}
var weight: Measurement<UnitMass>
var waistSize: Measurement<UnitLength>
var target: TargetType
var valueOfTarget: NSMeasurement {
switch target {
case .weight: return weight as NSMeasurement
case .waistSize: return waistSize as NSMeasurement
}
}
}
I'm trying to make a 'Difficulty' type that can take 3 states: easy, medium, or hard. Then 'minimum' and 'maximum' values will be set automatically and be reachable like "myDifficultyInstance.min".
I tried this, but it doesn't work; I get errors:
enum Difficulty {
case easy(min: 50, max: 200)
case medium(min: 200, max: 500)
case hard(min: 500, max: 1000)
}
Then I tried with a struct but it becomes too weird and ugly.
Is there a simple solution to do that ?
Default arguments are not allowed in enum cases.
When you define the cases of an enum, you can't define default values; imagine it as just creating "patterns".
But what you can do is create default cases as static constants:
enum Difficulty {
case easy(min: Int, max: Int)
case medium(min: Int, max: Int)
case hard(min: Int, max: Int)
static let defaultEasy = easy(min: 50, max: 200)
static let defaultMedium = medium(min: 200, max: 500)
static let defaultHard = hard(min: 500, max: 1000)
}
Then you can use it like this:
Difficulty.defaultEasy
Difficulty.defaultMedium
Difficulty.defaultHard
Also, I think that for your case, where you need to get the min or max value, it would be better to use a custom data model:
struct Difficulty {
var min: Int
var max: Int
static let easy = Difficulty(min: 50, max: 200)
static let medium = Difficulty(min: 200, max: 500)
static let hard = Difficulty(min: 500, max: 1000)
}
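Usage of the struct version is then straightforward, and a custom difficulty is just another instance:

```swift
struct Difficulty {
    var min: Int
    var max: Int

    static let easy = Difficulty(min: 50, max: 200)
    static let medium = Difficulty(min: 200, max: 500)
    static let hard = Difficulty(min: 500, max: 1000)
}

let difficulty = Difficulty.easy
print(difficulty.min) // 50
print(difficulty.max) // 200

// No extra enum machinery needed for a custom setting:
let custom = Difficulty(min: 70, max: 240)
```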
I know you accepted an answer already, but if you want to have both preset and customizable difficulty settings, I'd suggest doing it like this:
enum Difficulty {
case easy
case medium
case hard
case custom(min: Int, max: Int)
var min : Int {
switch self {
case .easy:
return 50
case .medium:
return 200
case .hard:
return 500
case .custom(let min,_):
return min
}
}
var max : Int {
switch self {
case .easy:
return 200
case .medium:
return 500
case .hard:
return 1000
case .custom(_,let max):
return max
}
}
}
This way you're getting enumerated difficulties (finite exclusive states) with an option to define a custom one.
Usage:
let difficulty : Difficulty = .easy
let customDifficulty : Difficulty = .custom(min: 70, max: 240)
let easyMin = difficulty.min
let easyMax = difficulty.max
let customMin = customDifficulty.min
let customMax = customDifficulty.max
I am running this code in both Xcode 9.3 and Xcode 10 beta 3 playgrounds:
import Foundation
public protocol EnumCollection: Hashable {
static func cases() -> AnySequence<Self>
}
public extension EnumCollection {
public static func cases() -> AnySequence<Self> {
return AnySequence { () -> AnyIterator<Self> in
var raw = 0
return AnyIterator {
let current: Self = withUnsafePointer(to: &raw) { $0.withMemoryRebound(to: self, capacity: 1) { $0.pointee } }
guard current.hashValue == raw else {
return nil
}
raw += 1
return current
}
}
}
}
enum NumberEnum: EnumCollection{
case one, two, three, four
}
Array(NumberEnum.cases()).count
Even though both are using Swift 4.1, they give me different results for the array count.
On Xcode 9.3, the size of the array is 4.
On Xcode 10 beta 3, the size of the array is 0.
I don't understand this at all.
That is an undocumented way to get a sequence of all enumeration values,
and worked only by chance with earlier Swift versions. It relies on the
hash values of the enumeration values being consecutive integers,
starting at zero.
That definitely does not work anymore with Swift 4.2 (even if running
in Swift 4 compatibility mode) because hash values are now always
randomized, see SE-0206 Hashable Enhancements:
To make hash values less predictable, the standard hash function uses a per-execution random seed by default.
You can verify that with
print(NumberEnum.one.hashValue)
print(NumberEnum.two.hashValue)
which does not print 0 and 1 with Xcode 10, but some
other values which also vary with each program run.
For a proper Swift 4.2/Xcode 10 solution, see How to enumerate an enum with String type?:
extension NumberEnum: CaseIterable { }
print(Array(NumberEnum.allCases).count) // 4
The solution below works for Xcode 10 / Swift 4.2 and above.
Step 1: Create Protocol EnumIterable.
protocol EnumIterable: RawRepresentable, CaseIterable {
var indexValue: Int { get }
}
extension EnumIterable where Self.RawValue: Equatable {
var indexValue: Int {
var index = -1
let cases = Self.allCases as? [Self] ?? []
for (caseIndex, caseItem) in cases.enumerated() {
if caseItem.rawValue == self.rawValue {
index = caseIndex
break
}
}
return index
}
}
Step 2: Conform your enums to the EnumIterable protocol.
enum Colors: String, EnumIterable {
case red = "Red"
case yellow = "Yellow"
case blue = "Blue"
case green = "Green"
}
Step 3: Use the indexValue property as you would have used hashValue.
Colors.red.indexValue
Colors.yellow.indexValue
Colors.blue.indexValue
Colors.green.indexValue
Sample Print statement and Output
print("Index Value: \(Colors.red.indexValue), Raw Value: \(Colors.red.rawValue), Hash Value: \(Colors.red.hashValue)")
Output: "Index Value: 0, Raw Value: Red, Hash Value: 1593214705812839748"
print("Index Value: \(Colors.yellow.indexValue), Raw Value: \(Colors.yellow.rawValue), Hash Value: \(Colors.yellow.hashValue)")
Output: "Index Value: 1, Raw Value: Yellow, Hash Value: -6836447220368660818"
print("Index Value: \(Colors.blue.indexValue), Raw Value: \(Colors.blue.rawValue), Hash Value: \(Colors.blue.hashValue)")
Output: "Index Value: 2, Raw Value: Blue, Hash Value: -8548080225654293616"
print("Index Value: \(Colors.green.indexValue), Raw Value: \(Colors.green.rawValue), Hash Value: \(Colors.green.hashValue)")
Output: "Index Value: 3, Raw Value: Green, Hash Value: 6055121617320138804"
If you use an enum's hashValue to determine case values (position or id), that's the wrong approach, since it isn't guaranteed to return sequential Int values (0, 1, 2, …). It no longer works as of Swift 4.2.
For example, if you use an enum like this:
enum AlertResultType {
case ok, cancel
}
hashValue of this enum might return a large int value instead of 0 (ok) and 1 (cancel).
So you may need to declare the enum type more precisely and use rawValue. For example:
enum AlertResultType : Int {
case ok = 0, cancel = 1
}
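With the Int raw value in place, you get stable sequential values without touching hashValue; and if you also adopt CaseIterable (Swift 4.2+), you can recover a case's position. A small sketch:

```swift
enum AlertResultType: Int, CaseIterable {
    case ok = 0, cancel = 1
}

print(AlertResultType.ok.rawValue)     // 0
print(AlertResultType.cancel.rawValue) // 1

// allCases preserves declaration order, so firstIndex gives a stable position.
let position = AlertResultType.allCases.firstIndex(of: .cancel) // Optional(1)
```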
}