Missing argument for parameter 'initialValue' in property wrapper initializer; add 'wrappedValue' and 'initialValue' arguments in - swift

I'd like to create a property wrapper to accommodate the well-known Decimal precision issues. However, when I use the @propertyWrapper as I understand it to be used and demonstrated here, I get the following errors:
Extra argument in call
Missing argument for parameter 'initialValue' in property wrapper initializer; add 'wrappedValue' and 'initialValue' arguments in '@MantissaClamping(...)'
I don't see how I have an "extra argument in call," I assign the decimal as the wrapped value, and I provide the integer literal as the mantissa argument.
After saying I have an extra argument, the other error says I'm missing an argument, and it seems to be suggesting that I literally add them as arguments to the property wrapper. That would defeat the whole purpose of property wrappers in my eyes, because it would require redundant code like the following. But even this doesn't work:
struct MantissaClampTestStruct {
    @MantissaClamping(Decimal("0.000000000001")!, 14) var small: Decimal = Decimal("0.000000000001")!
}
How can I assign a literal value to the property, and let that apply to the property wrapper? While providing the int value that directly applies to the property wrapper?
Here is my reproducible code you can put in a playground.
extension Decimal {
    /// Rounds a value
    /// - Parameters:
    ///   - roundingMode: up, down, plain, or bankers
    ///   - scale: the number of digits the result can have after its decimal point
    /// - Returns: the rounded number
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .bankers, scale: Int = 0) -> Self {
        var result = Self()
        var number = self
        NSDecimalRound(&result, &number, scale, roundingMode)
        return result
    }
}
@propertyWrapper
struct MantissaClamping {
    var value: Decimal
    let mantissaCount: Int

    init(initialValue value: Decimal, _ mantissaCount: Int) {
        precondition(mantissaCount < 19 && mantissaCount >= 0)
        self.value = value
        self.mantissaCount = mantissaCount
    }

    var wrappedValue: Decimal {
        get { value }
        set { value = newValue.rounded(.down, scale: mantissaCount) }
    }
}

struct MantissaClampTestStruct {
    @MantissaClamping(14) var small: Decimal = Decimal("0.000000000001")!
}

According to the docs:
When you include property wrapper arguments, you can also specify an initial value using assignment. Swift treats the assignment like a wrappedValue argument and uses the initializer that accepts the arguments you include.
So it translates your property declaration into something like:
var small = MantissaClamping(wrappedValue: Decimal("0.000000000001")!, 14)
Obviously, this doesn't match any of your initialisers.
Just rename the parameter label to wrappedValue:
init(wrappedValue value: Decimal, _ mantissaCount: Int) {
And also add the string: label to the Decimal initialiser, which you have missed:
@MantissaClamping(14) var small: Decimal = Decimal(string: "0.000000000001")!
You might also want to round the initial value too:
init(wrappedValue value: Decimal, _ mantissaCount: Int) {
    precondition(mantissaCount < 19 && mantissaCount >= 0)
    // here
    self.value = value.rounded(.down, scale: mantissaCount)
    self.mantissaCount = mantissaCount
}
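Putting both fixes together, a minimal self-contained sketch (including the `rounded(_:scale:)` helper from the question, and rounding in the initializer as suggested) that should compile in a Swift 5.1+ playground:

```swift
import Foundation

extension Decimal {
    /// Rounds a value to the given scale using NSDecimalRound.
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .bankers, scale: Int = 0) -> Self {
        var result = Self()
        var number = self
        NSDecimalRound(&result, &number, scale, roundingMode)
        return result
    }
}

@propertyWrapper
struct MantissaClamping {
    private var value: Decimal
    let mantissaCount: Int

    // The `wrappedValue` label lets Swift translate
    // `@MantissaClamping(14) var small: Decimal = ...` into this initializer.
    init(wrappedValue: Decimal, _ mantissaCount: Int) {
        precondition(mantissaCount < 19 && mantissaCount >= 0)
        self.mantissaCount = mantissaCount
        self.value = wrappedValue.rounded(.down, scale: mantissaCount)
    }

    var wrappedValue: Decimal {
        get { value }
        set { value = newValue.rounded(.down, scale: mantissaCount) }
    }
}

struct MantissaClampTestStruct {
    @MantissaClamping(14) var small: Decimal = Decimal(string: "0.000000000001")!
}

var test = MantissaClampTestStruct()
test.small = Decimal(string: "0.123456789012345678")!
print(test.small)   // clamped down to 14 fractional digits
```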


Round decimal to nearest increment given a number

I would like to round down a decimal to the nearest increment of another number. For example, given a value of 2.23678301 and an increment of 0.0001, I would like to round this to 2.2367. Sometimes the increment could be something like 0.00022, in which case the value would be rounded down to 2.23674.
I tried to do this, but sometimes the result is not correct and tests aren't passing:
extension Decimal {
    func rounded(byIncrement increment: Self) -> Self {
        var multipleOfValue = self / increment
        var roundedMultipleOfValue = Decimal()
        NSDecimalRound(&roundedMultipleOfValue, &multipleOfValue, 0, .down)
        return roundedMultipleOfValue * increment
    }
}
/// Tests
class DecimalTests: XCTestCase {
    func testRoundedByIncrement() {
        // Given
        let value: Decimal = 2.2367830187654

        // Then
        XCTAssertEqual(value.rounded(byIncrement: 0.00010000), 2.2367)
        XCTAssertEqual(value.rounded(byIncrement: 0.00022), 2.23674)
        XCTAssertEqual(value.rounded(byIncrement: 0.0000001), 2.236783)
        XCTAssertEqual(value.rounded(byIncrement: 0.00000001), 2.23678301) // XCTAssertEqual failed: ("2.23678301") is not equal to ("2.236783009999999744")
        XCTAssertEqual(value.rounded(byIncrement: 3.5), 0)
        XCTAssertEqual(value.rounded(byIncrement: 0.000000000000001), 2.2367830187654) // XCTAssertEqual failed: ("2.2367830187653998323726489726140416") is not equal to ("2.236783018765400576")
    }
}
I'm not sure why the decimal calculations are making up numbers that were never there, like the last assertion. Is there a cleaner or more accurate way to do this?
Your code is fine. You're just calling it incorrectly. This line doesn't do what you think:
let value: Decimal = 2.2367830187654
This is equivalent to:
let value = Decimal(Double(2.2367830187654))
The value is first converted to a Double, binary rounding it to 2.236783018765400576. That value is then converted to a Decimal.
You need to use the string initializer everywhere you want a Decimal from a digit string:
let value = Decimal(string: "2.2367830187654")!
XCTAssertEqual(value.rounded(byIncrement: Decimal(string: "0.00000001")!), Decimal(string: "2.23678301")!)
etc.
Or you can use the integer-based initializers:
let value = Decimal(sign: .plus, exponent: -13, significand: 22367830187654)
In iOS 15 there are some new initializers that don't return optionals (init(_:format:lenient:) for example), but you're still going to need to pass Strings, not floating point literals.
You could also do this, though it may be confusing to readers, and might lead to bugs if folks take the quotes away:
extension Decimal: ExpressibleByStringLiteral {
    public init(stringLiteral value: String) {
        self.init(string: value)!
    }
}
let value: Decimal = "2.2367830187654"
XCTAssertEqual(value.rounded(byIncrement: "0.00000001"), "2.23678301")
For test code, that's probably nice, but I'd be very careful about using it in production code.

Wrapped property with @propertyWrapper won't return wrapped type

In this simplified example, I make a property wrapper around a UInt to hold a natural number (an integer > 0). My own example uses a more complex filter, but this shows the issue. Rather than find a workaround, the point of the question is to shed light on the (to me) confusing error.
Assigning it to a simple UInt brings the error message listed.
Using its wrappedValue property as in the following line works fine. But surely the whole point of the wrapping is to be able to treat it as a UInt, as returned by the get?
The error "cannot assign value of type 'NonZero' to type 'UInt'" appears to undermine the whole point of the wrapper type. What am I misunderstanding?
Xcode 11.0
import Foundation

@propertyWrapper
struct NonZero {
    private let myNumber: UInt

    init(n: UInt) {
        if n == 0 { fatalError(" cannot be 0") }
        myNumber = n
    }

    var wrappedValue: UInt {
        get { return myNumber }
    }
}
struct Nums {
    var num: UInt

    init(nz: NonZero) {
        num = nz // error message "cannot assign value of type 'NonZero' to type 'UInt'"
        num = nz.wrappedValue // no error
    }
}
This is not how property wrappers work. The code:
init(nz: NonZero)
declares nz to have the type NonZero, which is just the struct defined earlier; NonZero here is not being used as a property wrapper.
A property wrapper is used as an attribute on a property declaration, for example:
@NonZero var num: UInt = 1
[Putting that in the code requires changing the parameter label of NonZero's init to wrappedValue, e.g.:
init(wrappedValue: UInt) { ... wrappedValue ... }
]
You could write the init of struct Nums as:
init(nz: UInt)
{
    num = nz // ok *after* you add a `set` to `NonZero`; will `fatalError` if `nz` is zero
    let unwrapped: UInt = num // ok
    print(unwrapped) // so you see it worked
}
Keep exploring! You might find SE-0259 Property Wrappers (Apple) and Swift Property Wrappers (NSHipster) useful.
HTH
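Tying those hints together, a minimal sketch of NonZero rewritten as suggested: the init parameter relabelled to wrappedValue, and a set added so plain assignment works (using precondition rather than fatalError is a stylistic choice here):

```swift
@propertyWrapper
struct NonZero {
    private var myNumber: UInt

    init(wrappedValue: UInt) {
        precondition(wrappedValue != 0, "cannot be 0")
        myNumber = wrappedValue
    }

    var wrappedValue: UInt {
        get { myNumber }
        set {
            precondition(newValue != 0, "cannot be 0")
            myNumber = newValue
        }
    }
}

struct Nums {
    @NonZero var num: UInt = 1
}

var nums = Nums()
nums.num = 5                 // assignment goes through the wrapper's setter
let plain: UInt = nums.num   // reads back as a plain UInt, no .wrappedValue needed
print(plain)                 // 5
```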

Cannot match values of type

I'm just starting out with Swift 3 and I'm converting a Rails project to Swift (side project while I learn).
Fairly simple: I have a Rails statement I'm converting, and I'm getting many red errors in Xcode:
let startingPoint: Int = 1
let firstRange: ClosedRange = (2...10)
let secondRange: ClosedRange = (11...20)

func calc(range: Float) -> Float {
    switch range {
    case startingPoint:
        return (range - startingPoint) * 1 // or 0.2
    case firstRange:
        return // code
    default:
        return // code
    }
}
calc will receive either an Int or a Float value, e.g. 10 or 10.50.
Errors are:
Expression pattern of type ClosedRange cannot match values of type Float
Binary operator - cannot be applied to operands of type Float and Int
I understand the errors but I don't know what to search for to correct them. Could you point me in the right direction, please?
Swift is strongly typed. Whenever you use a variable or pass something as a function argument, Swift checks that it is of the correct type. You can't pass a string to a function that expects an integer etc. Swift does this check at compile time (since it's statically typed).
To adhere to those rules, try changing your code to this:
let startingPoint: Float = 1
let firstRange: ClosedRange<Float> = (2...10)
let secondRange: ClosedRange<Float> = (11...20)

func calc(range: Float) -> Float {
    switch range {
    case startingPoint:
        return (range - startingPoint) * 1 // or 0.2
    case firstRange:
        return 1.0 // 1.0 is just an example, but you have to return a Float since that is what the method declares
    default:
        return 0.0 // 0.0 is just an example, put whatever you need here
    }
}
For the first error, you might want to specify the ClosedRange to be a ClosedRange<Float>. Something similar to:
let firstRange: ClosedRange<Float> = (2...10)
For the second error, the problem is you are trying to compare a Float (range:Float) with an Int (startingPoint). So I would suggest you convert the startingPoint variable to a Float as well.
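For reference, a case with a range works because ClosedRange supplies the pattern-matching operator ~=, so the value being switched on must share the range's element type; a quick sketch:

```swift
let firstRange: ClosedRange<Float> = 2...10

// `case firstRange:` in a switch desugars to this pattern-match operator call:
print(firstRange ~= 5.0)    // true: 5.0 lies inside 2...10
print(firstRange ~= 11.0)   // false
```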

Why converting Double to Int doesn't return optional Int in Swift?

Converting a String to Int returns an optional value, but converting a Double to Int does not return an optional value. Why is that? I wanted to check whether a double value is bigger than the maximum Int value, but because the converting function does not return an optional value, I am not able to check using optional binding.
var stringNumber: String = "555"
var intValue = Int(stringNumber) // returns optional(555)
var doubleNumber: Double = 555
var fromDoubleToInt = Int(doubleNumber) // returns 555
So if I try to convert a double number bigger than maximum Integer, it crashes instead of returning nil.
var doubleNumber: Double = 55555555555555555555
var fromDoubleToInt = Int(doubleNumber) // Crashes here
I know that there's another way to check if a double number is bigger than maximum Integer value, but I'm curious as why it's happening this way.
If we consider that for most doubles, a conversion to Int simply means dropping the decimal part:
let pieInt = Int(3.14159) // 3
Then the only case in which the Int(Double) constructor returns nil is in the case of an overflow.
With strings, converting to Int returns an optional, because generally, strings, such as "Hello world!" cannot be represented as an Int in a way that universally makes sense. So we return nil in the case that the string cannot be represented as an integer. This includes, by the way, values that can be perfectly represented as doubles or floats:
Consider:
let iPi = Int("3.14159")
let dPi = Double("3.14159")
In this case, iPi is nil while dPi is 3.14159. Why? Because "3.14159" doesn't have a valid Int representation.
But meanwhile, when we use the Int constructor which takes a Double and returns non-optional, we get a value.
So, if that constructor is changed to return an optional, why would it return 3 for 3.14159 instead of nil? 3.14159 can't be represented as an integer.
But if you want a method that returns an optional Int, returning nil when the Double would overflow, you can just write that method.
extension Double {
    func toInt() -> Int? {
        let minInt = Double(Int.min)
        let maxInt = Double(Int.max)
        guard case minInt ... maxInt = self else {
            return nil
        }
        return Int(self)
    }
}
let a = 3.14159.toInt() // returns 3
let b = 555555555555555555555.5.toInt() // returns nil
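As an aside, newer Swift versions already ship a failable conversion, Int(exactly:), which returns nil on overflow. Note that, unlike the extension above, it also rejects values with a fractional part:

```swift
let exact = Int(exactly: 3.0)           // Optional(3)
let fractional = Int(exactly: 3.14159)  // nil: not exactly representable as Int
let overflow = Int(exactly: 55555555555555555555.0) // nil: exceeds Int.max on 64-bit

print(exact as Any, fractional as Any, overflow as Any)
```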
Failable initializers and methods with Optional return types are designed for scenarios where you, the programmer, can't know whether a parameter value will cause failure, or where verifying that an operation will succeed is equivalent to performing the operation:
let intFromString = Int(someString)
let valueFromDict = dict[someKey]
Parsing an integer from a string requires checking the string for numeric/non-numeric characters, so the check is the same as the work. Likewise, checking a dictionary for the existence of a key is the same as looking up the value for the key.
By contrast, certain operations are things where you, the programmer, need to verify upfront that your parameters or preconditions meet expectations:
let foo = someArray[index]
let bar = UInt32(someUInt64)
let baz: UInt = someUInt - anotherUInt
You can — and in most cases should — test at runtime whether index < someArray.count and someUInt64 < UInt32.max and someUInt > anotherUInt. These assumptions are fundamental to working with those kinds of types. On the one hand, you really want to design around them from the start. On the other, you don't want every bit of math you do to be peppered with Optional unwrapping — that's why we have types whose axioms are stated upfront.

Xcode6 beta 4 downcast to float

I have a Dictionary with a String and AnyObject, so [String: AnyObject].
In a function I want to check the type of the dict value. So this code worked in Xcode6-Beta 3:
for (key, value: AnyObject) in contents {
    ...
    } else if value is Float {
        stringValue = String(value as Float) + ","
Now I get the error: AnyObject is not convertible to Float
stringValue = String(Float(value)) + "," doesn't work as well.
Any ideas?
There is no problem with casting AnyObject to Float; the instruction below will execute without errors:
var f: Float = value as Float
The problem is that Swift's String has no initializer that takes a Float. If you do:
var str: String = String(f) // error: String has no initializer for Float
Swift has only added a String initializer for Int, which converts directly; there is no initializer for Float:
var i: Int = value as Int
var str: String = String(i) // this will run fine
Now, to solve your problem you can do:
for (key, value: AnyObject) in contents {
    if value is Int {
    } else if value is Float {
        // capture this in a var and use it
        var stringValue = "\(value as Float),"
    }
}
In the future Swift may add an initializer for Float, but currently there is none.
Replace "as" with "as!" to force downcast.
Please remember that you can use the forced form of the type cast operator (as!) only when you are sure that the downcast will always succeed; otherwise a runtime error will be triggered. In your particular case there will be no problem, since there is a previous check (if value is Float).
You can't cast to float because:
AnyObject can represent an instance of any class type.
From Swift Programming Guide
But Float isn't a class. You will have to use Any instead:
Any can represent an instance of any type at all, apart from function types.
From Swift Programming Guide
This should work (but I can't test on my current Mac):
for (key, value: Any) in contents {
    ...
    } else if value is Float {
        stringValue = String(value as Float) + ","
As wumm said, you can't directly cast to Float because Float is not a class type. You may be able to cast it to an NSNumber and bridge to a Float though. Try
value as NSNumber as Float
This strategy works with casting AnyObject into a Swift String.
Using your code and fixing the concatenation it would look like:
for (key, value: AnyObject) in contents {
    if value is Float {
        stringValue = "\(stringValue) \(value as NSNumber as Float) ,"
    }
}
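For readers on current Swift: the for (key, value: AnyObject) loop syntax is long gone, and a conditional cast with as? is the idiomatic replacement. A sketch, assuming the dictionary values are NSNumbers created from Floats (on Apple platforms, NSNumber bridges to Float via as? when the value round-trips exactly):

```swift
import Foundation

// Hypothetical sample data standing in for the question's `contents`.
let contents: [String: AnyObject] = ["ratio": NSNumber(value: Float(1.5))]

var stringValue = ""
for (_, value) in contents {
    if let f = value as? Float {   // conditional cast: nil instead of a crash
        stringValue += "\(f),"
    }
}
print(stringValue)   // "1.5,"
```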