How do I define a constrained type in Swift?

I keep bumping into this problem repeatedly. In real life I see sets of numbers that represent a particular quality, but I have difficulty expressing them as distinct types in Swift.
For example, a percent type. Let's say I want a Percent type that holds only integers and can never go above 100 or below zero.
I could express that in pure C as an enum with members ranging from 0 to 100. However, using a Swift enum with an underlying raw-value type doesn't seem like the correct approach to me. Or is it?
Let's pick another one: the BRIBOR interbank interest rate. I know it will always be in the range between 0 and 20 percent, but the number itself will be a decimal with two decimal places.
What's the correct way to deal with this problem in Swift? Generics, perhaps?

As Michael says in the comment, probably something like this:
struct IntPercent {
    let value: Int8
    init?(_ v: Int) {
        guard v >= 0 && v <= 100 else { return nil }
        value = Int8(v)
    }
}
(Note: use a struct, not a class, for a base value like that)
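A quick usage sketch of the failable initializer (the struct is repeated so the snippet runs standalone):

```swift
struct IntPercent {
    let value: Int8
    init?(_ v: Int) {
        guard v >= 0 && v <= 100 else { return nil }
        value = Int8(v)
    }
}

let ok = IntPercent(42)         // succeeds: value is in 0...100
let outOfRange = IntPercent(150) // fails: initializer returns nil
assert(ok?.value == 42)
assert(outOfRange == nil)
```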
If you do that a lot, you can improve that a little using protocols, like so:
protocol RestrictedValue {
    associatedtype T : Comparable
    static var range : ( T, T ) { get }
    var value : T { get set }
    init() // hack to make it work
}
extension RestrictedValue {
    init?(v: T) {
        self.init()
        guard Self.range.0 <= v && Self.range.1 >= v else { return nil }
        value = v
    }
}
struct IntPercent : RestrictedValue {
    static var range = ( 0, 100 )
    var value : Int = 0
    init() {}
}
I don't think you can use generics to limit base type values.
But I bet there is an even better solution - this one is definitely not awesome :-)
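To check that the protocol-based version behaves the same way, here is a self-contained sketch of it in use:

```swift
protocol RestrictedValue {
    associatedtype T: Comparable
    static var range: (T, T) { get }
    var value: T { get set }
    init()
}

extension RestrictedValue {
    init?(v: T) {
        self.init()
        guard Self.range.0 <= v && Self.range.1 >= v else { return nil }
        value = v
    }
}

struct IntPercent: RestrictedValue {
    static var range = (0, 100)
    var value: Int = 0
    init() {}
}

assert(IntPercent(v: 50)?.value == 50)  // in range
assert(IntPercent(v: 101) == nil)       // rejected by the guard
```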

Constraining the value of an object is not the same as a constrained type. Constraining values of numbers doesn't really make sense, as the things you are talking about are just numbers -- there is no such thing as a percent-number or a Bribor-Interbank-Interest-rate-number; they are just numbers. If you want to constrain their value, you do it wherever you get or use the numbers. It doesn't make sense to define a new type simply to constrain the values of an existing type.

Related

How to calculate percentage with generics in Swift?

I would do the following conversion: if width == 200, then
20 -> 40
20.0 -> 40.0
So it should work for both Int and Double.
let width = 333
func unitX<T>(x: T) -> T {
    return width * x / 100
}
but I get this error:
Binary operator '*' cannot be applied to operands of type 'Int' and 'T'
What can I do?
...If generics are really the way you want to go with this:
You'll want to start by defining some constraints on T to restrict it to types that make sense to take a percentage of. From there, you'll simply need to enumerate the possibilities and cast, or define your own operator on a protocol you conform to.
You can read about custom operators here: https://docs.swift.org/swift-book/LanguageGuide/AdvancedOperators.html
In theory you could do something like this:
postfix operator %
protocol Percentable {
    var value: Double { get set }
}
extension Percentable {
    static postfix func % (operand: Self) -> Self {
        var result = operand
        // do percentage logic here, e.g. divide by 100
        result.value = operand.value / 100
        return result
    }
}
And then you could call it like the following:
func unitX<T>(x: T) -> T where T: Percentable {
    return x%
}
Note: you would then have to write an extension on Int or Double to make them conform to Percentable.
Recap
This would be a horrible approach, since you would simply be converting from an Int to a Double and back to an Int. It would make more sense (as mentioned in the comments) to simply overload for Ints and Doubles separately to keep your precision. Generics are not a good fit here.
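A sketch of that overload-based alternative (my example, not code from the answer):

```swift
let width = 200

// One overload per numeric type sidesteps generic arithmetic
// constraints entirely and keeps each type's precision.
func unitX(_ x: Int) -> Int { return width * x / 100 }
func unitX(_ x: Double) -> Double { return Double(width) * x / 100.0 }

assert(unitX(20) == 40)      // Int in, Int out
assert(unitX(20.0) == 40.0)  // Double in, Double out
```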

How to Define Generic “Invalid” Values for Different Types

In my app, I am using Integers, Doubles, Floats, and CGFloats to represent a number of different values. According to my app’s semantics, these values may become “invalid”, a state which I represent using a reserved value, i.e. -1. The simplest approach to making this usable in code would be this:
anIntVariable = -1
aFloatVariable = -1
aDoubleVariable = -1.0
...
To get away from this convention driven approach and increase readability and adaptability I defined a number of extensions:
extension Int {
    static var invalid = -1
}
extension Float {
    // without the annotation, -1.0 would be inferred as Double
    static var invalid: Float = -1.0
}
extension Double {
    static var invalid = -1.0
}
...
So the above code would now read:
anIntVariable = .invalid
aFloatVariable = .invalid
aDoubleVariable = .invalid
...
It does work. However, I’m not really happy with this approach. Does anyone of you have an idea for a better way of expressing this?
To add some complexity, in addition to simple types like Int, Float, or Double, I also use Measurement based types like this:
let length = Measurement(value: .invalid, unit: UnitLength.baseUnit())
Extra bonus point if you find a way to include “invalid“ measurements in your solution as well...
Thanks for helping!
Some Additional Thoughts
I know I could use optionals with nil meaning “invalid”. In this case, however, you’d have the additional overhead of conditional unwrapping... Also, using nil as “invalid” is yet another convention.
It isn’t better or worse, just different. Apple uses “invalid” values in its own APIs, e.g. the NSTableView method row(for:) will return -1 if the view is not in the table view. I agree, however, that this very method perfectly illustrates that returning an optional would make a lot of sense...
I'd use optionals for that.
If you want lack of value and invalid value to be different states in your app, I'd suggest creating a wrapper for your values:
enum Validatable<T> {
    case valid(T)
    case invalid
}
And use it like that:
let validValue : Validatable<Int> = .valid(5)
let invalidValue : Validatable<Int> = .invalid
var validOptionalDouble : Validatable<Double?> = .valid(nil)
validOptionalDouble = .valid(5.0)
let measurement : Validatable<Measurement<UnitLength>> = .invalid
etc.
You can then check for a value by switching on the enum to access the associated value, like this:
switch validatableValue {
case .valid(let value):
    // do something with value
case .invalid:
    // handle invalid state
}
or
if case .valid(let value) = validatableValue {
    // handle valid state
}
etc
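If pattern matching at every call site gets verbose, a computed property can bridge back to an optional (a hypothetical convenience, not part of the answer above):

```swift
enum Validatable<T> {
    case valid(T)
    case invalid

    // Hypothetical helper: returns the wrapped value, or nil when invalid.
    var validValue: T? {
        if case .valid(let v) = self { return v }
        return nil
    }
}

let a: Validatable<Int> = .valid(5)
let b: Validatable<Int> = .invalid
assert(a.validValue == 5)
assert(b.validValue == nil)
```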

Subclassing Swift Double / Operator Overloading typealias

I'd like to be able to subclass Double with some custom types in my Swift code so I can do some introspection and operator overloading later on.
This is semantically what I want to be able to write:
class Frequency: Double {}
class Period: Double {
    init(_ frequency: Frequency) {
        let period: Double = 1 / frequency
        self.init(period)
    }
}
let a = Double(1)
print(type(of: a)) // Double
let b = Frequency(2)
print(type(of: b)) // Frequency
let c = Period(a)
print(type(of: c)) // Period == 1
let d = Period(b)
print(type(of: d)) // Period == 0.5
It feels like what I'm trying to do should be possible, as Swift is a strictly typed language.
I've looked at typealiases as well, but you can't overload operators on those. I've also looked at the FloatingPoint protocol, but it doesn't seem to help me.
While this is not possible, I created a class a while ago which addressed a similar issue. I needed a polyvalent variable class, for ease of syntax in currency strings, and ended up with something like the below. So far it's working great, and I've been using it as mortar for many advanced subclasses I've built since then. It does what you wish; as you can see in the Frequency subclass, it becomes a matter of tweaking the init override for each use case.
While the class is large and the methods bulky, feel free to tweak and modify it however you see fit, or use it as a starting point if you find simpler approaches. I uploaded it to a gist file here so it can be read easily.
Link to the class.
When used with your use case, it allows for the following, which seems to be what you want:
class Frequency : MultiVar {
    override init(_ value: Any?) {
        super.init(value)
        let current = double
        guard current != 0.0 else {
            print("Frequency Error: Something went wrong while subclassing \(self), established variable 'double' is equal to 0!")
            return
        }
        double = 1 / current
    }
}
let freq = Frequency(10)
print(freq.string) // prints 0.1
print(freq.double) // prints 0.1
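For comparison, a lightweight wrapper-struct sketch (using assumed property names hertz and seconds; this is not the gist's MultiVar approach), since Double is a struct and cannot be subclassed:

```swift
// Wrapper structs give distinct, introspectable types without subclassing.
struct Frequency {
    var hertz: Double
}

struct Period {
    var seconds: Double
    init(_ seconds: Double) { self.seconds = seconds }
    init(_ frequency: Frequency) { self.seconds = 1 / frequency.hertz }
}

let f = Frequency(hertz: 2)
let p = Period(f)
assert(p.seconds == 0.5)          // period is the reciprocal of frequency
print(type(of: p))                // prints Period
```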

Swift Generic "Numeric" Protocol that Validates Numeric Values

Swift 3.
Ultimately my functions need to receive UInt8 data types, but I'm never sure if the arguments I will receive from callers will be Int, Int64, UInt, Float, etc. I know they will be numeric types, I just don't know which flavor.
I could do:
func foo(value: Int) { }
func foo(value: Float) {}
func foo(value: UInt) {}
But that's crazy. So I thought I could do something like create a protocol
protocol ValidEncodable {
}
And then pass in types that conform:
func foo(value: ValidEncodable) { }
And then in that function I could get my values into the correct format
func foo(value: ValidEncodable) -> UInt8 {
    let correctedValue = min(max(floor(value), 0), 100)
    return UInt8(correctedValue)
}
I'm really struggling to figure out:
1) How to create this ValidEncodable protocol that covers all the numeric types
2) How to do things like floor(value) when the value I get is an Int, without writing a branch for every possible numeric type (floor(x) is only available on floating-point types)
Ultimately I need the values to be UInt8 in the range of 0-100. The whole reason for this madness is that I'm parsing XML files to my own internal data structures and I want to bake in some validation to my values.
This can be done without a protocol, and by making use of compiler checks, which greatly reduces the chances of bugs.
My recommendation is to use a partial function, i.e. a function that, instead of taking an Int, takes an already validated value. Check this article for a more in-depth description of why partial functions are great.
You can build an Int0to100 struct, which has either a failable or a throwing initializer (depending on taste):
struct Int0to100 {
    let value: UInt8
    init?(_ anInt: Int) {
        guard anInt >= 0 && anInt <= 100 else { return nil }
        value = UInt8(anInt)
    }
    init?(_ aFloat: Float) {
        let flooredValue = floor(aFloat)
        guard flooredValue >= 0 && flooredValue <= 100 else { return nil }
        value = UInt8(flooredValue)
    }
    // ... other initializers can be added the same way
}
and change foo so that it can only be called with this argument:
func foo(value: Int0to100) {
    // do your stuff here; you know for sure, at compile time,
    // that the function can be called with a valid value only
}
You move the responsibility of validating the integer value to the caller; however, the validation resolves to checking an optional, which is easy and allows you to handle an invalid number with minimal effort.
Another important aspect is that you explicitly declare the domain of the foo function, which improves the overall design of the code.
And last but not least, enforcement at compile time greatly reduces the potential for runtime issues.
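At the call site, the validation reduces to unwrapping an optional; for instance (repeating the struct so the snippet runs on its own):

```swift
struct Int0to100 {
    let value: UInt8
    init?(_ anInt: Int) {
        guard anInt >= 0 && anInt <= 100 else { return nil }
        value = UInt8(anInt)
    }
}

func foo(value: Int0to100) -> UInt8 {
    return value.value
}

// The caller validates once; foo can never see an out-of-range number.
if let percent = Int0to100(42) {
    assert(foo(value: percent) == 42)
}
assert(Int0to100(142) == nil)  // rejected before foo is ever involved
```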
If you know your incoming values will lie in 0..<256, then you can just construct a UInt8 and pass it to the function.
func foo(value: UInt8) -> UInt8 { return value }
let number = arbitraryNumber()
print(foo(value: UInt8(number)))
This will trap at runtime if the value is too large to fit in a byte, but otherwise will work. You could protect against this type of error by doing some bounds checking between the second and third lines.
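One way to do that bounds check without risking a trap is the standard library's failable exactly: initializer, sketched here:

```swift
let number = 300

// UInt8(exactly:) returns nil instead of trapping when the value
// does not fit in a byte.
if let byte = UInt8(exactly: number) {
    print(byte)
} else {
    print("\(number) does not fit in a UInt8")
}

assert(UInt8(exactly: 300) == nil)   // too large for a byte
assert(UInt8(exactly: 200) == 200)   // fits, conversion succeeds
```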

Type 'Int' does not conform to protocol 'BooleanType'?

I know there is another thread with the same question, but it doesn't tell what is actually causing the problem.
I'm new to Swift, so I'm a bit confused by this.
I wrote a very simple program that is supposed to start with a default number of followers (0) and assign that to 'followerdeafault', and once that becomes 1 it's supposed to become "followers", but I get the error "Type 'Int' does not conform to protocol 'BooleanType'". What is causing this, and why?
var followerdeafault = 0
var followers = 0
if (followerdeafault++) {
    var followers = followerdeafault
}
In Swift you can't implicitly substitute Int instead of Bool. This was done to prevent confusion and make code more readable.
So instead of this
let x = 10
if x { /* do something */ }
You have to write this:
let x = 10
if x != 0 { /* do something */ }
Also, you can't pass an Optional instead of a Bool to check whether it's nil, as you would in Objective-C. Use an explicit comparison instead:
if myObject != nil { /* do something */ }
As the comments said, you're trying to use an Int in a Bool comparison statement. What you're looking for is probably something like this:
if followerdeafault++ == 1 { ... }
Also, a side note: the ++ operator was deprecated in Swift 2.2 and removed in Swift 3; use += instead.
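Putting both fixes together, the snippet from the question might read (keeping the question's variable name):

```swift
var followerdeafault = 0
var followers = 0

followerdeafault += 1  // += instead of the removed ++ operator

if followerdeafault == 1 {  // explicit Int comparison yields a Bool
    followers = followerdeafault
}
assert(followers == 1)
```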