Why is Swift Decimal Returning Number from String Containing Letters? - swift

I am working with Swift's Decimal type, trying to ensure that a user-entered String is a valid Decimal.
I have two String values in my Playground file, each containing a letter: one has the letter at the start, the other at the end. I initialize a Decimal from each value, and only one initialization fails: the Decimal initialized from the value with the letter at the beginning.
Why does the Decimal initialized from the value with the letter at the end return a valid Decimal? I expect nil to be returned.

It works this way because Decimal(string:) accepts any numeric characters at the start of the string; the first non-numeric character acts as a terminator, and everything from that point on is ignored. So in your example:
12a = 12 ( the a in position 3 terminates parsing after the digits )
a12 = nil ( the a in position 1 terminates parsing before any digit is read )
If you want both to be invalid whenever the string contains a letter, you could use Float instead, since its failable initializer requires the entire string to be numeric.
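A minimal Playground sketch of the difference (assuming Foundation for Decimal):
import Foundation
Decimal(string: "12a") // 12; parsing stops at "a" and the rest is ignored
Decimal(string: "a12") // nil; no digits before the terminator
Float("12a") // nil; the whole string must be numeric
Float("a12") // nil
Float("12") // 12.0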

Related

Converting a hex to string in Swift formatted to keep the same number of digits

I'm trying to create a string from the hex values in an array, but whenever a hex value in the array starts with a zero, that zero disappears from the resulting string.
I use String(value:radix:uppercase) to create the string.
An example:
Here's an array: [0x13245678, 0x12345678, 0x12345678, 0x12345678].
Which gives me the string: 12345678123456781234567812345678 (32 characters)
But the following array: [0x02345678, 0x12345678, 0x02345678, 0x12345678] (notice that I replaced two 1's with zeroes).
Gives me the string: 234567812345678234567812345678 (30 characters)
I'm not sure why it removes the zeroes. I know the value is correct; how can I format it to keep the zero if it was there?
The number 0x01234567 is really just 0x1234567. Leading zeros in Swift number literals don't mean anything (unlike C, where a leading 0 marks an octal literal; Swift uses the 0o prefix for octal).
Instead of using String(value:radix:uppercase), use String(format:).
import Foundation // String(format:) comes from Foundation

let num = 0x1234567
let str = String(format: "%08X", num) // "01234567"
Explanation of the format:
The 0 means to pad the left end of the string with zeros as needed.
The 8 means you want the result to be at least 8 characters long.
The X means you want the number converted to uppercase hex. Use x if you want lowercase hex.
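Applied to the array from the question, a short sketch (again assuming Foundation):
import Foundation
let values = [0x02345678, 0x12345678, 0x02345678, 0x12345678]
let hex = values.map { String(format: "%08X", $0) }.joined()
// "02345678123456780234567812345678": 32 characters, leading zeros preserved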

Convert Character to Integer in Swift

I am creating an iPhone app and I need to convert a single-digit number into an integer.
My code has a variable called char of type Character, but I need to be able to do math with it, so I think I need to convert it to a String first; however, I cannot find a way to do that.
In the latest Swift versions (at least since Swift 5) there is a more straightforward way of converting Character instances. Character has the property wholeNumberValue, which tries to convert a character to Int and returns nil if the character does not represent an integer.
let char: Character = "5"
if let intValue = char.wholeNumberValue {
print("Value is \(intValue)")
} else {
print("Not an integer")
}
With a Character you can create a String. And with a String you can create an Int.
let char: Character = "1"
if let number = Int(String(char)) {
// use number
}
The String middleman type conversion isn’t necessary if you use the unicodeScalars property of Swift 4.0’s Character type.
let myChar: Character = "3"
myChar.unicodeScalars.first!.value - Unicode.Scalar("0")!.value // 3: UInt32
This uses a trick commonly seen in C code: subtracting the value of the character literal '0' to convert from ASCII values to decimal values. See this site for the conversions: https://www.asciitable.com
Also, there are some implicit unwraps in my answer. To avoid them, you can validate that you have a decimal digit with CharacterSet.decimalDigits, and/or use guard let around the first property. You can also subtract 48 directly rather than converting "0" through Unicode.Scalar.
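A minimal guarded sketch (checking the ASCII digit range directly, since CharacterSet.decimalDigits also matches non-ASCII digits such as Devanagari numerals):
let myChar: Character = "3"
if let scalar = myChar.unicodeScalars.first,
   (48...57).contains(scalar.value) { // ASCII "0"..."9"
    let digit = scalar.value - 48 // 3: UInt32
    print(digit)
}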

String to Integer (atoi) [Leetcode] gave wrong answer?

String to Integer (atoi)
This problem is to implement atoi, which converts a string to an integer.
When the test input is " +0 123"
my code returns 123
But why is the expected answer 0?
======================
And if the test input is " +0123"
my code returns 123
Now the expected answer is 123
So is that answer wrong?
I think this is the expected result, as the problem statement says:
Requirements for atoi:
The function first discards as many whitespace characters as necessary until the first non-whitespace character is found. Then, starting from this character, takes an optional initial plus or minus sign followed by as many numerical digits as possible, and interprets them as a numerical value.
Your first test case has a space between two digit groups, and atoi only considers the first group, which is '0', and converts it to the integer 0.
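A minimal Swift sketch of that behavior (a hypothetical helper; overflow clamping is omitted for brevity):
func parseAtoi(_ s: String) -> Int {
    var chars = s.drop(while: { $0 == " " }) // discard leading whitespace
    var sign = 1
    if let first = chars.first, first == "+" || first == "-" {
        sign = (first == "-") ? -1 : 1
        chars = chars.dropFirst()
    }
    var result = 0
    for c in chars {
        guard ("0"..."9").contains(c), let digit = c.wholeNumberValue else { break } // stop at the first non-digit
        result = result * 10 + digit
    }
    return sign * result
}

parseAtoi(" +0 123") // 0; parsing stops at the space after "0"
parseAtoi(" +0123") // 123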

Converting number in scientific notation to int

Could someone explain why I cannot use int() to convert an integer represented as a string in scientific notation into a Python int?
For example, this does not work:
print int('1e1')
But this does:
print int(float('1e1'))
print int(1e1) # Works
Why does int not recognise the string as an integer? Surely it's as simple as checking the sign of the exponent?
Behind the scenes, a number in scientific notation is always represented as a float. The reason is the range it must cover: an integer type maps to a fixed range of values, say 2^32 of them, while the scientific representation, like the floating-point representation, stores a significand and an exponent. You can look up further details at https://en.wikipedia.org/wiki/Floating_point.
You cannot cast a string containing scientific notation to an integer directly.
print int(1e1) # Works
Works because 1e1 as a number is already a float.
>>> type(1e1)
<type 'float'>
Back to your question: we want to get an integer from a float or from a scientific-notation string. Details: https://docs.python.org/2/reference/lexical_analysis.html#integers
>>> int("13.37")
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ValueError: invalid literal for int() with base 10: '13.37'
For float or scientific representations you have to use the intermediate step over float.
Very simple solution:
print(int(float('1e1')))
Steps:
1. First convert the scientific-notation string to a float.
2. Convert that float value to an int.
3. You finally have the int data type.
Enjoy.
Because in Python (at least in 2.x, since I do not use Python 3.x), int() behaves differently on strings and on numeric values. If you pass a string, Python will try to parse it as a base-10 int:
int ("077")
>> 77
But if you pass a valid numeric value, Python will interpret it according to its base and type, convert it to base 10, and int() will just return it:
int (077) # Leading 0 defines a base 8 number.
>> 63
077
>> 63
So int('1e1') will try to parse '1e1' as a base-10 string and will throw a ValueError. But 1e1 is a numeric value (a mathematical expression):
1e1
>> 10.0
So int() handles it as a numeric value: Python first evaluates 1e1 to the float 10.0, and int() then converts that float to an integer.
So when calling int() with a string value, you must be sure the string is a valid base-10 integer literal.
int(float('1e+001')) will work,
whereas, as others have mentioned, 1e1 is already a float.

Format String to truncate a number to a specific number of digits

Is there a format string to truncate a number to a specific number of digits?
For example, I would like to truncate any number longer than 5 digits to its first 3 digits.
132456 -> 132
5000000 -> 500
@Erik: Format specifiers like %2d are specific to a language? I actually want to use it in JavaScript.
Pseudo-Code
Function returning a String, receiving a String representing a Number as a parameter
    IF the String has more than 5 characters
        RETURN a substring containing the first 3 characters
    ELSE
        RETURN the String received as a parameter
    END IF
END Function
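In Swift, that pseudocode might look like this (hypothetical helper name; the same logic ports directly to JavaScript):
func truncateNumberString(_ s: String) -> String {
    // Keep only the first 3 characters when the number has more than 5 digits.
    return s.count > 5 ? String(s.prefix(3)) : s
}

truncateNumberString("132456") // "132"
truncateNumberString("5000000") // "500"
truncateNumberString("12345") // "12345"; only 5 digits, returned unchanged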
I assume you are referring to printf format strings. I couldn't find anything that will truncate an integer argument (i.e. %d), but you can specify the maximum length of a string argument by using a string conversion and giving lengths as "%<MinLength>.<MaxLength>s".
So in your case you could turn your number arguments into strings and then use "%3.3s".