Swift number formatting

I am just starting to get to know Swift but I am having a serious problem with number formatting at an extremely basic level.
For example, I need to display an integer with at least 2 digits (e.g. 00, 01, 02, 03, 04, 05 ...). The normal syntax I'd expect would be something like:
println(" %02i %02i %02i", var1, var2, var3);
...but I don't find any clear instructions for how to achieve this in Swift. I find it really hard to believe that I need to create a custom function to do that. The same goes for formatting a float or double value to a fixed number of decimal places.
I've found links to a couple of similar questions (Precision String Format Specifier In Swift & How to use println in Swift to format number), but they seem to mix in Objective-C and even talk about Python and using unity libraries. Is there no Swift solution to this basic programming need? Is it really true that something so fundamental has been completely overlooked in Swift?

You can construct a string with C-style formatting using this initializer:
String(format: String, arguments: [CVarArgType])
Sample usage:
var x = 10
println(String(format: "%04d", arguments: [x])) // This will print "0010"
If you're going to use it a lot, and want a more compact form, you can implement an extension like this:
extension String {
    func format(arguments: [CVarArgType]) -> String {
        return String(format: self, arguments: arguments)
    }
}
which simplifies its usage, as in this example:
"%d apples cost $%03.2f".format([4, 4 * 0.33])

Here's a protocol-oriented (POP) solution to the problem:
protocol Formattable {
    func format(pattern: String) -> String
}
extension Formattable where Self: CVarArg {
    func format(pattern: String) -> String {
        return String(format: pattern, arguments: [self])
    }
}
extension Int: Formattable { }
extension Double: Formattable { }
extension Float: Formattable { }
let myInt = 10
let myDouble: Double = 0.01
let myFloat: Float = 1.11
print(myInt.format(pattern: "%04d")) // "0010"
print(myDouble.format(pattern: "%.2f")) // "0.01"
print(myFloat.format(pattern: "$%03.2f")) // "$1.11"
print(100.format(pattern: "%05d")) // "00100"

You can still use good ole NSLog("%.2f",myFloatOrDouble) too :D

There is a simple trick I learned from "We <3 Swift" that is so easy you can even use it without Foundation, round(), or Strings, keeping the numeric value.
Example:
var number = 31.726354765
var intNumber = Int(number * 1000.0)
var roundedNumber = Double(intNumber) / 1000.0
result: 31.726
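The same truncation trick can be wrapped in a small helper; this is a sketch (the function name is my own), assuming you always want to truncate toward zero to a given number of decimal places:

```swift
// Truncate a Double to `places` decimal digits without Foundation,
// keeping the result numeric. Only valid while value * factor fits in Int.
func truncated(_ value: Double, places: Int) -> Double {
    var factor = 1.0
    for _ in 0..<places { factor *= 10 }  // 10^places without pow()
    return Double(Int(value * factor)) / factor
}

print(truncated(31.726354765, places: 3))  // 31.726
```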

Related

Round decimal to nearest increment given a number

I would like to round down a decimal to the nearest increment of another number. For example, given a value of 2.23678301 and an increment of 0.0001, I would like to round this to 2.2367. Sometimes the increment could be something like 0.00022, in which case the value would be rounded down to 2.23674.
I tried to do this, but sometimes the result is not correct and tests aren't passing:
extension Decimal {
    func rounded(byIncrement increment: Self) -> Self {
        var multipleOfValue = self / increment
        var roundedMultipleOfValue = Decimal()
        NSDecimalRound(&roundedMultipleOfValue, &multipleOfValue, 0, .down)
        return roundedMultipleOfValue * increment
    }
}
/// Tests
class DecimalTests: XCTestCase {
    func testRoundedByIncrement() {
        // Given
        let value: Decimal = 2.2367830187654
        // Then
        XCTAssertEqual(value.rounded(byIncrement: 0.00010000), 2.2367)
        XCTAssertEqual(value.rounded(byIncrement: 0.00022), 2.23674)
        XCTAssertEqual(value.rounded(byIncrement: 0.0000001), 2.236783)
        XCTAssertEqual(value.rounded(byIncrement: 0.00000001), 2.23678301) // XCTAssertEqual failed: ("2.23678301") is not equal to ("2.236783009999999744")
        XCTAssertEqual(value.rounded(byIncrement: 3.5), 0)
        XCTAssertEqual(value.rounded(byIncrement: 0.000000000000001), 2.2367830187654) // XCTAssertEqual failed: ("2.2367830187653998323726489726140416") is not equal to ("2.236783018765400576")
    }
}
I'm not sure why the decimal calculations are making up numbers that were never there, like the last assertion. Is there a cleaner or more accurate way to do this?
Your code is fine. You're just calling it incorrectly. This line doesn't do what you think:
let value: Decimal = 2.2367830187654
This is equivalent to:
let value = Decimal(floatLiteral: 2.2367830187654)
The literal is first converted to a Double, binary-rounding it to 2.236783018765400576. That value is then converted to a Decimal.
You need to use the string initializer everywhere you want a Decimal from a digit string:
let value = Decimal(string: "2.2367830187654")!
XCTAssertEqual(value.rounded(byIncrement: Decimal(string: "0.00000001")!), Decimal(string: "2.23678301")!)
etc.
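A quick demonstration of the difference between the two routes (a sketch; the exact digits the Double route produces are the ones quoted above):

```swift
import Foundation

// Routed through Double first: picks up binary rounding error.
let viaLiteral: Decimal = 2.2367830187654
// Parsed digit by digit from the string: exact.
let viaString = Decimal(string: "2.2367830187654")!

print(viaLiteral == viaString)  // false
```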
Or you can use the integer-based initializers:
let value = Decimal(sign: .plus, exponent: -13, significand: 22367830187654)
In iOS 15 there are some new initializers that don't return optionals (init(_:format:lenient:) for example), but you're still going to need to pass Strings, not floating point literals.
You could also do this, though it may be confusing to readers, and might lead to bugs if folks take the quotes away:
extension Decimal: ExpressibleByStringLiteral {
    public init(stringLiteral value: String) {
        self.init(string: value)!
    }
}
let value: Decimal = "2.2367830187654"
XCTAssertEqual(value.rounded(byIncrement: "0.00000001"), "2.23678301")
For test code, that's probably nice, but I'd be very careful about using it in production code.

Subclassing Swift Double / Operator Overloading typealias

I'd like to be able to subclass Double with some custom types in my Swift code so I can do some introspection and operator overloading later on.
This is semantically what I want to be able to write:
class Frequency: Double {}
class Period: Double {
    init(_ frequency: Frequency) {
        let period: Double = 1 / frequency
        self.init(period)
    }
}
let a = Double(1)
print(type(of: a)) // Double
let b = Frequency(2)
print(type(of: b)) // Frequency
let c = Period(a)
print(type(of: c)) // Period == 1
let d = Period(b)
print(type(of: d)) // Period == 0.5
I feel like what I'm trying to do should be possible, since Swift is a strictly typed language.
I've looked at typealiases as well, but you can't overload operators with those. I've also looked at the FloatingPoint protocol, but it doesn't seem to help me.
While this is not possible, I created a class a while ago which addressed a similar issue. I needed a polyvalent variable class for ease of syntax in currency strings, and ended up with something like the one below. So far it's working great, and I've been using it as mortar for many advanced subclasses I've built since then. It does what you wish; as you can see in the Frequency subclass, it becomes a matter of tweaking the init override for each use case.
The class is large and the methods bulky, so feel free to tweak and modify it however you see fit, or simplify the approach. I uploaded it to a gist file here so it can be read easily.
Link to the class.
When used with your use case, it allows for the following, which seems to be what you want:
class Frequency: MultiVar {
    override init(_ value: Any?) {
        super.init(value)
        let current = double
        guard current != 0.0 else {
            print("Frequency Error: Something went wrong while subclassing \(self), established variable 'double' is equal to 0!")
            return
        }
        double = 1 / current
    }
}
let freq = Frequency(10)
print(freq.string) //prints 0.1
print(freq.double) //prints 0.1
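Since Double is a struct, it can't be subclassed at all. As a lighter-weight alternative sketch (my own, not tied to the MultiVar class above), you can wrap the value in distinct structs and add only the initializers and operators you need:

```swift
// Double is a struct, so it can't be subclassed. Wrapper structs
// give distinct types that still carry a Double payload.
struct Frequency { var value: Double }

struct Period {
    var value: Double
    init(value: Double) { self.value = value }
    // A period is the reciprocal of a frequency.
    init(_ frequency: Frequency) {
        value = 1 / frequency.value
    }
}

let f = Frequency(value: 2)
let p = Period(f)
print(type(of: p), p.value)  // Period 0.5
```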

Swift 3 String has no member components [duplicate]

So I'm trying to prepare myself for coding interviews by doing HackerRank's test case samples. If you're familiar with the process, you usually take a standard input that has various lines of strings and you extract the information based on what the question is asking. I have come across numerous questions where they will give you a line (as a String) with n number of integers separated by a space (i.e. 1 2 3 4 5). In order to solve the problem I need to extrapolate an array of Int ([Int]) from a String. I came up with this nifty method:
func extractIntegers(_ s: String) -> [Int] {
    let splits = s.characters.split { [" "].contains(String($0)) }
    return splits.map { Int(String($0).trimmingCharacters(in: .whitespaces))! }
}
So I code it in my Playground and it works fantastic, I even run multiple test cases I make up, and they all pass with flying colors...then I copy the code to HackerRank and try running it for submission. And I get this:
solution.swift:16:29: error: value of type 'String' has no member 'trimmingCharacters'
return splits.map { Int(String($0).trimmingCharacters(in: .whitespaces))! }
So... okay maybe HR hasn't updated everything for Swift 3 yet. No big deal! I have an idea for an even cleaner solution! Here it is:
func extractIntegers(_ s: String) -> [Int] {
    return s.components(separatedBy: " ").map { Int($0)! }
}
....AAAAANDDD of course:
solution.swift:15:12: error: value of type 'String' has no member 'components'
return s.components(separatedBy: " ").map { Int($0)! }
So now I'm forced to use a really sloppy method where I loop through all the characters, check for spaces, append substrings from ranges between spaces into an array, and then map that array and return it.
Does anyone have any other clean ideas to work around HR's inadequacies with Swift? I would like any recommendations I can get!
Thanks in advance!
The String methods
func trimmingCharacters(in set: CharacterSet) -> String
func components(separatedBy separator: String) -> [String]
are actually methods of the NSString class, defined in the Foundation framework and "bridged" to Swift. Therefore, to make your code compile, you have to add
import Foundation
But a slightly simplified version of your first method compiles in pure Swift, without importing Foundation. It handles leading, trailing, and intermediate whitespace:
func extractIntegers(_ s: String) -> [Int] {
    let splits = s.characters.split(separator: " ").map(String.init)
    return splits.map { Int($0)! }
}
let a = extractIntegers(" 12 234 -567 4 ")
print(a) // [12, 234, -567, 4]
Update for Swift 4 (and simplified):
func extractIntegers(_ s: String) -> [Int] {
    return s.split(separator: " ").compactMap { Int($0) }
}
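A nice property of the compactMap version (repeating the function here so the snippet is self-contained): split(separator:) drops empty subsequences, so leading, trailing, and repeated spaces are fine, and compactMap simply skips tokens that aren't valid integers instead of crashing:

```swift
func extractIntegers(_ s: String) -> [Int] {
    return s.split(separator: " ").compactMap { Int($0) }
}

print(extractIntegers(" 12  234 -567 4 "))  // [12, 234, -567, 4]
print(extractIntegers("3 x 4"))             // [3, 4] ("x" is skipped)
```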

Swift convert string to UnsafeMutablePointer<Int8>

I have a C function mapped to Swift defined as:
func swe_set_eph_path(path: UnsafeMutablePointer<Int8>) -> Void
I am trying to pass a path to the function and have tried:
var path = [Int8](count: 1024, repeatedValue: 0)
for i in 0...NSBundle.mainBundle().bundlePath.lengthOfBytesUsingEncoding(NSUTF16StringEncoding)-1 {
    var range = i..<i+1
    path[i] = String.toInt(NSBundle.mainBundle().bundlePath[range])
}
println("\(path)")
swe_set_ephe_path(&path)
but on the path[i] line I get the error:
'subscript' is unavailable: cannot subscript String with a range of Int
Neither
swe_set_ephe_path(NSBundle.mainBundle().bundlePath)
nor
swe_set_ephe_path(&NSBundle.mainBundle().bundlePath)
works either.
Besides not working, I feel there has got to be a better, less convoluted way of doing this. Previous answers on StackOverflow using CString don't seem to work anymore. Any suggestions?
Previous answers on StackOverflow using CString don't seem to work anymore
Nevertheless, UnsafePointer<Int8> is a C string. If your context absolutely requires an UnsafeMutablePointer, just coerce, like this:
let s = NSBundle.mainBundle().bundlePath
let cs = (s as NSString).UTF8String
var buffer = UnsafeMutablePointer<Int8>(cs)
swe_set_ephe_path(buffer)
Of course I don't have your swe_set_ephe_path, but it works fine in my testing when it is stubbed like this:
func swe_set_ephe_path(path: UnsafeMutablePointer<Int8>) {
    println(String.fromCString(path))
}
In the current version of Swift you can do it like this (other answers are outdated):
let path = Bundle.main.bundlePath
let param = UnsafeMutablePointer<Int8>(mutating: (path as NSString).utf8String)
It’s actually extremely irritating of the library you’re using that it requires (in the C declaration) a char * path rather than const char * path. (this is assuming the function doesn’t mutate the input string – if it does, you’re in a whole different situation).
If it didn’t, the function would come over to Swift as:
// note, UnsafePointer not UnsafeMutablePointer
func swe_set_eph_path(path: UnsafePointer<Int8>) -> Void
and you could then rely on Swift’s implicit conversion:
let str = "blah"
swe_set_eph_path(str) // Swift implicitly converts Strings
// to const C strings when calling C funcs
But you can do an unsafe conversion quite easily, in combination with the withCString function:
str.withCString { cstr in
    swe_set_eph_path(UnsafeMutablePointer(cstr))
}
I had a static library (someLibrary.a) written in C++ compiled for iOS.
The header file (someLibrary.h) had a function exposed like this:
extern long someFunction(char* aString);
The declaration in Swift looks like this:
func someFunction(aString: UnsafeMutablePointer<Int8>) -> Int
I made an extension to String:
extension String {
    var UTF8CString: UnsafeMutablePointer<Int8> {
        return UnsafeMutablePointer((self as NSString).UTF8String)
    }
}
So then I can call the method like so:
someFunction(mySwiftString.UTF8CString)
Update: the same String extension for Swift 5.7:
extension String {
    var UTF8CString: UnsafeMutablePointer<Int8> {
        return UnsafeMutablePointer(mutating: (self as NSString).utf8String!)
    }
}

Swift - Resolving a math operation in a string

just a short question. In Swift it is possible to solve the following code:
var a: String;
a = "\(3*3)";
The arithmetic operation in the string will be evaluated. But I can't figure out why the following variation doesn't work.
var a: String;
var b: String;
b = "3*3";
a = "\(b)";
In this case the arithmetic operation in var a will not be resolved. Any ideas why, and how I can get this to work? Some things would be much easier if this worked. Thanks for your answers.
In the second case, you are interpolating a string, not an arithmetic expression. In your example, it's a string you chose at compile time, but in general it might be a string from the user, or loaded from a file or over the web. In other words, at runtime b could contain some arbitrary string. The compiler isn't available at runtime to parse an arbitrary string as arithmetic.
If you want to evaluate an arbitrary string as an arithmetic formula at runtime, you can use NSExpression. Here's a very simple example:
let expn = NSExpression(format:"3+3")
println(expn.expressionValueWithObject(nil, context: nil))
// output: 6
You can also use a third-party library like DDMathParser.
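As a slightly fuller sketch in modern Swift (assuming NSExpression's format parser handles the usual arithmetic operators and parentheses, and casting the Any? result to NSNumber):

```swift
import Foundation

// NSExpression parses arithmetic operators and parentheses at runtime.
let expr = NSExpression(format: "(3 + 5) * 2 - 4")
if let value = expr.expressionValue(with: nil, context: nil) as? NSNumber {
    print(value.intValue)  // 12
}
```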
Swift 4.2:
let expn = NSExpression(format: "3+3")
if let result = expn.expressionValue(with: nil, context: nil) {
    print(result) // 6
}
I also have a solution that's not the most efficient, but it can be used in some cases if you're sure the string is only of the form "y+x" and no longer.
var yNumber: Int!
var xNumber: Int!
let expn: String? = "3+3"

// Here we take the first character of the expn string.
if let firstNumber = expn?.prefix(1), let myInt = Int(firstNumber) {
    // This will print (Int : 3)
    print("Int : \(myInt)")
    // Set the value to yNumber
    yNumber = myInt
}
// Here we take the last character of the expn string.
if let lastNumber = expn?.suffix(1), let myInt = Int(lastNumber) {
    // This will print (Int : 3)
    print("Int : \(myInt)")
    // Set the value to xNumber
    xNumber = myInt
}
// Now you can add the two numbers.
print(yNumber + xNumber)
// will print (6)
I can't recommend this, but it works in some cases.
This won't be evaluated, because it is not an arithmetic operation; it's a string:
"3*3"
the same as this
"String"
Everything you put inside quotation marks is a string.
The working example lets you construct a new String value from a mix of constants, variables, literals, and expressions:
"\(3*3)"
This is possible because of string interpolation, \(). You inserted an expression that Swift evaluates and converts, creating the expected result.
You can try the evaluatePostfixNotationString method from that class. The whole project is about recognizing a math expression from a camera image and calculating it afterwards.