Number validation and formatting - Swift

I want to format, in real time, the number entered into a UITextField. Depending on the field, the number may be an integer or a double, and may be positive or negative.
Integers are easy (see below).
Doubles should be displayed exactly as the user enters them, with three possible exceptions:
If the user begins with a decimal separator, or a negative sign followed by a decimal separator, insert a leading zero:
"." becomes "0."
"-." becomes "-0."
Remove any "excess" leading zeros if the user deletes a decimal point:
If the number is "0.00023" and the decimal point is deleted, the number should become "23".
Do not allow a leading zero if the next character is not a decimal separator:
"03" becomes "3".
Long story short, one and only one leading zero, no trailing zeros.
It seemed like the easiest idea was to convert the (already validated) string to a number and then use format specifiers. I've scoured:
https://developer.apple.com/library/content/documentation/Cocoa/Conceptual/Strings/Articles/formatSpecifiers.html
and
http://www.cplusplus.com/reference/cstdio/printf/
and others but can't figure out how to format a double so that it does not add a decimal when there are no digits after it, or any trailing zeros. For example:
var x = 23.0
print(String(format: "%f", x))
// output is 23.000000
// I want 23
x = 23.45
print(String(format: "%f", x))
// output is 23.450000
// I want 23.45
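As it turns out, NumberFormatter can produce exactly this output; here is a minimal sketch, with an assumed cap of six fraction digits (the cap is not from the question):

import Foundation

let formatter = NumberFormatter()
formatter.minimumFractionDigits = 0 // whole numbers get no decimal point
formatter.maximumFractionDigits = 6 // assumed cap, adjust as needed

print(formatter.string(from: NSNumber(value: 23.0))!)  // 23
print(formatter.string(from: NSNumber(value: 23.45))!) // 23.45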
On How to create a string with format?, I found this gem:
var str = "\(INT_VALUE) , \(FLOAT_VALUE) , \(DOUBLE_VALUE), \(STRING_VALUE)"
print(str)
It works perfectly for integers (why I said integers are easy above), but for doubles it appends a ".0" onto the first character the user enters. (It does work perfectly in a playground, but not in my program. Why???)
Will I have to resort to counting the number of digits before and after the decimal separator and inserting them into a format specifier? (And if so, how do I count those? I know how to create the format specifier.) Or is there a really simple way or a quick fix to use that one-liner above?
Thanks!

Turned out to be simple without using NumberFormatter (which I'm not so sure would really have accomplished what I want without a LOT more work).
let decimalSeparator = Locale.current.decimalSeparator ?? "."
var tempStr = textField.text ?? ""
var i = tempStr.count

// Remove leading zeros for positive numbers (integer or real).
if i > 1 {
    while tempStr[0] == "0" && tempStr[1] != decimalSeparator[0] {
        tempStr.remove(at: tempStr.startIndex)
        i -= 1
        if i < 2 {
            break
        }
    }
}

// Remove leading zeros for negative numbers (integer or real).
if i > 2 {
    while (tempStr[0] == "-" && tempStr[1] == "0") && tempStr[2] != decimalSeparator[0] {
        tempStr.remove(at: tempStr.index(tempStr.startIndex, offsetBy: 1))
        i -= 1
        if i < 3 {
            break
        }
    }
}
Using the following extension to subscript the string:
extension String {
    subscript(i: Int) -> Character {
        return self[index(startIndex, offsetBy: i)]
    }
}
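For comparison, the same trimming can be written without the Int subscript extension, using dropFirst() in a loop. This is my own sketch, not the answer's code:

func trimLeadingZeros(_ text: String) -> String {
    let separator = Locale.current.decimalSeparator ?? "."
    let sign = text.hasPrefix("-") ? "-" : ""
    var digits = text.dropFirst(sign.count)
    // Drop a zero only while another digit (not the separator) follows it.
    while digits.first == "0", digits.count > 1,
          digits.dropFirst().first != separator.first {
        digits = digits.dropFirst()
    }
    return sign + digits
}

trimLeadingZeros("03")   // "3"
trimLeadingZeros("-003") // "-3"
trimLeadingZeros("0.5")  // "0.5"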

Related

How to align numbers vertically? Swift, UIKit

I want to align numbers by digits in table rows. For example:
___123.4
-5 678.9
That is, so that tens line up under tens, units under units, and fraction digits under fraction digits.
To convert a number to a string, I use the numberStringFormat function below.
func numberStringFormat(_ number: Double) -> String {
    let numberFormatter = NumberFormatter()
    numberFormatter.numberStyle = .decimal
    numberFormatter.maximumFractionDigits = 1
    numberFormatter.groupingSeparator = " "
    let result = numberFormatter.string(from: NSNumber(value: number))
    return result ?? ""
}
This function sets the decimal style, caps the number of fraction digits, and groups the digits before the decimal point.
But if the number is an integer without a fraction, or the digit after the decimal point turns out to be 0 after rounding, then the formatted string looks like, for example, 123.
And then these numbers in the rows of the table are shifted and it turns out like this:
----123
5 678.9
That is, the fractional number on the bottom row is under the integer number on the top row.
I think I can solve this task if I force the number to always show a 0 fraction digit after converting to a string.
I tried googling but couldn't find an answer to this question.
Maybe someone has already encountered such situations and can suggest a possible solution, or at least in what direction to move?
Or, perhaps there is some other solution without forcing 0 to be shown, but simply aligning the characters vertically?
Any ideas are welcome. I really appreciate your help.
Update: a great solution from HangarRash:
minimumFractionDigits = 1
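In context, that is one extra line in the function above (my sketch of the updated formatter):

func numberStringFormat(_ number: Double) -> String {
    let numberFormatter = NumberFormatter()
    numberFormatter.numberStyle = .decimal
    numberFormatter.minimumFractionDigits = 1 // always show one fraction digit
    numberFormatter.maximumFractionDigits = 1
    numberFormatter.groupingSeparator = " "
    return numberFormatter.string(from: NSNumber(value: number)) ?? ""
}

numberStringFormat(123)    // "123.0"
numberStringFormat(5678.9) // "5 678.9"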

Swift 5: String prefix with a maximum UTF-8 length

I have a string that can contain arbitrary Unicode characters and I want to get a prefix of that string whose UTF-8 encoded length is as close as possible to 32 bytes, while still being valid UTF-8 and without changing the characters' meaning (i.e. not cutting off an extended grapheme cluster).
Consider this CORRECT example:
let string = "\u{1F3F4}\u{E0067}\u{E0062}\u{E0073}\u{E0063}\u{E0074}\u{E007F}\u{1F1EA}\u{1F1FA}"
print(string) // 🏴󠁧󠁢󠁳󠁣󠁴󠁿🇪🇺
print(string.count) // 2
print(string.utf8.count) // 36
let prefix = string.utf8Prefix(32) // <-- function I want to implement
print(prefix) // 🏴󠁧󠁢󠁳󠁣󠁴󠁿
print(prefix.count) // 1
print(prefix.utf8.count) // 28
print(string.hasPrefix(prefix)) // true
And this example of a WRONG implementation:
let string = "ar\u{1F3F4}\u{200D}\u{2620}\u{FE0F}\u{1F3F4}\u{200D}\u{2620}\u{FE0F}\u{1F3F4}\u{200D}\u{2620}\u{FE0F}"
print(string) // ar🏴‍☠️🏴‍☠️🏴‍☠️
print(string.count) // 5
print(string.utf8.count) // 41
let prefix = string.wrongUTF8Prefix(32) // <-- wrong implementation
print(prefix) // ar🏴‍☠️🏴‍☠️🏴
print(prefix.count) // 5
print(prefix.utf8.count) // 32
print(string.hasPrefix(prefix)) // false
What's an elegant way to do this? (besides trial&error)
You've shown no attempt at a solution, and SO doesn't normally write code for you. So instead, here are some algorithm suggestions for you:
What's an elegant way to do this? (besides trial&error)
By what definition of elegant? (Like beauty, it depends on the eye of the beholder...)
Simple?
Start with String.makeIterator(), write a while loop, and append Characters to your prefix as long as the byte count stays ≤ 32.
It's a very simple loop; the worst case is 32 iterations and 32 appends.
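A minimal sketch of that loop (the method name is mine, not from the answer):

extension String {
    // Append one Character at a time while the running UTF-8 byte
    // count stays within the limit.
    func utf8PrefixByIteration(_ maxLength: Int) -> String {
        var prefix = ""
        var byteCount = 0
        for character in self {
            byteCount += String(character).utf8.count
            if byteCount > maxLength { break }
            prefix.append(character)
        }
        return prefix
    }
}

"ar🏴‍☠️🏴‍☠️🏴‍☠️".utf8PrefixByIteration(32) // "ar🏴‍☠️🏴‍☠️" (28 bytes)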
"Smart" Search Strategy?
You could implement a strategy based on the average byte length of each Character in the String, using String.prefix(_:).
E.g. for your first example, the character count is 2 and the byte count is 36, giving an average of 18 bytes/character. 18 goes into 32 just once (we don't deal in fractional characters or bytes!), so start with prefix(1), which has a byte count of 28 and leaves 1 character and 8 bytes. The remainder therefore has an average byte length of 8, you are seeking at most 4 more bytes, 8 goes into 4 zero times, and you are done.
The above example shows the case of extending (or not) your prefix guess. If your prefix guess is too long you can just start your algorithm from scratch using the prefix character & byte counts rather than the original string's.
If you have trouble implementing your algorithm ask a new question showing the code you've written, describe the issue, and someone will undoubtedly help you with the next step.
HTH
I discovered that String and String.UTF8View share the same indices, so I managed to create a very simple (and efficient?) solution, I think:
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        var index = self.utf8.index(self.startIndex, offsetBy: maxLength + 1)
        self.formIndex(before: &index)
        return self.prefix(upTo: index)
    }
}
Explanation (assuming maxLength == 32 and startIndex == 0):
The first case (utf8.count <= maxLength) should be clear; that's where no work is needed.
For the second case we first get utf8-index 33, which is either:
A: the endIndex of the string (if it's exactly 33 bytes long),
B: an index at the start of a character (after 33 bytes of previous characters), or
C: an index somewhere in the middle of a character (after <33 bytes of previous characters).
So if we now move our index back one character (with formIndex(before:)), it jumps to the first extended grapheme cluster boundary before index, which in cases A and B is one character before, and in case C is the start of that character.
In any case, the utf8-index is now guaranteed to be at most 32 and to sit at an extended grapheme cluster boundary, so prefix(upTo: index) will safely create a prefix with length ≤ 32.
…but it's not perfect.
In theory this should also always be the optimal solution, i.e. the prefix's count is as close as possible to maxLength. But sometimes, when the string ends with an extended grapheme cluster consisting of more than one Unicode scalar, formIndex(before: &index) goes back one character more than necessary, so the prefix ends up shorter. I'm not exactly sure why that's the case.
EDIT: A less elegant but completely "correct" solution would be this (still only O(n)):
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        let endIndex = self.utf8.index(self.startIndex, offsetBy: maxLength)
        var index = self.startIndex
        while index <= endIndex {
            self.formIndex(after: &index)
        }
        self.formIndex(before: &index)
        return self.prefix(upTo: index)
    }
}
I like the first solution you came up with. I've found it works more correctly (and is simpler) if you take out the formIndex:
extension String {
    func utf8Prefix(_ maxLength: Int) -> Substring {
        if self.utf8.count <= maxLength {
            return Substring(self)
        }
        let index = self.utf8.index(self.startIndex, offsetBy: maxLength)
        return self.prefix(upTo: index)
    }
}
My solution looks like this:
extension String {
    func prefix(maxUTF8Length: Int) -> String {
        if self.utf8.count <= maxUTF8Length { return self }
        var utf8EndIndex = self.utf8.index(self.utf8.startIndex, offsetBy: maxUTF8Length)
        while utf8EndIndex > self.utf8.startIndex {
            if let stringIndex = utf8EndIndex.samePosition(in: self) {
                return String(self[..<stringIndex])
            } else {
                self.utf8.formIndex(before: &utf8EndIndex)
            }
        }
        return ""
    }
}
It takes the highest possible utf8 index and checks whether it is a valid character index using the Index.samePosition(in:) method. If not, it reduces the utf8 index one by one until it finds a valid character index.
The advantage is that you could replace utf8 with utf16 and it would also work.
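For instance, with the pirate-flag string from the question and a limit that falls in the middle of a Unicode scalar (my own check, not from the answer):

let string = "ar🏴‍☠️🏴‍☠️🏴‍☠️"               // 2 + 3 × 13 = 41 UTF-8 bytes
let prefix = string.prefix(maxUTF8Length: 31) // byte 31 is mid-scalar
print(prefix)                   // ar🏴‍☠️🏴‍☠️
print(prefix.utf8.count)        // 28
print(string.hasPrefix(prefix)) // true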

Removing Digits from a Number

Does anybody know if there is a way of removing (trimming) the last two digits or the first two digits from a number? I have the number 4131 and I want to separate it into 41 and 31, but after searching the Internet I've only managed to find how to remove characters from a string, not a number. I tried converting my number to a string, removing characters, and converting it back to a number, but I keep getting errors.
I believe I can get the first two digits by dividing the number by 100 and rounding down, but I have no idea how to get the last two digits.
Does anybody know the function to use to achieve what I'm trying to do, or can anybody point me in the right direction?
Try this:
let num = 1234
let first = num / 100 // 12
let last = num % 100  // 34
The playground's output is what you need.
You can use the methods below to find the first and last two digits:
func getLastTwoDigits(number: Int) -> Int {
    return number % 100 // 4131 % 100 = 31
}

func getFirstTwoDigits(number: Int) -> Int {
    return number / 100 // 4131 / 100 = 41
}
To find the first two digits you may need to adjust the logic depending on how many digits the number has. The method below is a generalized way to extract each digit of a number.
func printDigits(number: Int) {
    var num = number
    while num > 0 {
        let digit = num % 10 // get the last digit
        print("\(digit)")
        num = num / 10 // remove the last digit
    }
}
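Note that this prints the digits from last to first (a quick check of mine):

printDigits(number: 4131)
// prints 1, 3, 1, 4 (the digits of 4131 in reverse order)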

Why does ("0"+"1").toInt()! return 1 as an Int, rather than 0

Was playing around with the reduce function on collections in Swift.
// The reduce should return an Int with the value 3141.
let digits = ["3", "1", "4", "1"]
    .reduce(0) { (total: Int, digit: String) in
        return ("\(total)" + digit).toInt()!
    }
The function is giving the correct output, but why does ("0"+"1").toInt()! return 1 as an Int, rather than 0? The string combined to be turned into an Int is "01". I assume this is a String that the function cannot convert to an Int directly. Does it just default to the second character instead?
"0"+"1" == "01". You're doing concatenation not addition. You lose the 0 when you convert to int because it's a leading zero.
Leading zero's are usually dropped as meaningless but in some contexts they actually signal that you're expressing an octal based number. Even if that's the case here it'd still end up evaluating to 1.
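In current Swift, where toInt() has become the failable initializer Int(_:), a purely numeric reduce avoids the string round-trip entirely; a sketch:

let digits = ["3", "1", "4", "1"]
let value = digits.reduce(0) { total, digit in
    total * 10 + Int(digit)! // force-unwrap assumes every element is a digit
}
print(value) // 3141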

NSDecimalNumber and popping digits off of the end

I'm not quite sure what to call it, but I have a text field that holds a currency value, so I'm storing it as an NSDecimalNumber. I don't want to use the numbers & symbols keyboard, so I'm using a number pad and inferring the location of the decimal place like ATMs do. It works fine for entering numbers: type 1234 and it displays $12.34. But now I need to implement backspace. So, assuming $12.34 is entered, hitting backspace would show $1.23. I'm not quite sure how to do this with a decimal number. With an Int you would just divide by 10 to remove the rightmost digit, but that obviously doesn't work here. I could do it by some messy conversion to Int, dividing by 10, then converting back to decimal, but that just sounds horrific... Any suggestions?
Call - (NSDecimalNumber *)decimalNumberByDividingBy:(NSDecimalNumber *)decimalNumber withBehavior:(id<NSDecimalNumberBehaviors>)behavior on it.
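In Swift that method is dividing(by:withBehavior:). A sketch of the ATM-style backspace using a truncating handler (values are mine):

import Foundation

// Truncate (round down) to two fraction digits after dividing by ten,
// so 12.34 / 10 = 1.234 becomes 1.23.
let handler = NSDecimalNumberHandler(roundingMode: .down,
                                     scale: 2,
                                     raiseOnExactness: false,
                                     raiseOnOverflow: false,
                                     raiseOnUnderflow: false,
                                     raiseOnDivideByZero: false)
let amount = NSDecimalNumber(string: "12.34")
let afterBackspace = amount.dividing(by: NSDecimalNumber(value: 10), withBehavior: handler)
print(afterBackspace) // 1.23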
How about using stringValue?
1) NSDecimalNumber to String
2) Drop the last character
3) String back to NSDecimalNumber
Below is an example for Swift 3
func popLastNumber(of number: NSDecimalNumber) -> NSDecimalNumber {
    let stringFromNumber = number.stringValue // NSNumber property
    let lastIndex = stringFromNumber.endIndex
    let targetIndex = stringFromNumber.index(before: lastIndex)
    let removed = stringFromNumber.substring(to: targetIndex)
    return NSDecimalNumber(string: removed)
}
If your input number is a single digit, it returns NaN; you could replace that with NSDecimalNumber.zero if you need to.
It works like the delete button on a calculator.
It hasn't been tested much; if someone finds another NaN case, please report it in a reply.