How to detect illegal input in Swift string when converting to Double

Converting a string to a double in Swift is done as follows:
var string = "123.45"
string.bridgeToObjectiveC().doubleValue
If string is not a legal double value (e.g. "a23e"), the call to doubleValue returns 0.0. Since it does not return nil, how am I supposed to discriminate between the legal user input of 0.0 and an illegal input?

This is not a problem specific to Swift. With NSString in Objective-C, the doubleValue method also returns 0.0 for invalid input (which is pretty horrible!). You are going to have to check the format of the string manually.
See the various options available to you here:
Check that an input to UITextField is numeric only

Here is my solution: try to parse the string, and if the result isn't nil, return the double value; otherwise, handle the failure however you want.
var string = "2.1.2.1.2"
if let number = NSNumberFormatter().numberFromString(string) {
return number.doubleValue
} else {
...
}
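A self-contained version of the same idea, wrapped in a helper function (the name parseDouble is my own; this assumes the Swift 2-era APIs used above):
import Foundation

func parseDouble(string: String) -> Double? {
    // NSNumberFormatter returns nil for input it can't parse, e.g. "2.1.2.1.2".
    return NSNumberFormatter().numberFromString(string)?.doubleValue
}

parseDouble("123.45")    // Optional(123.45)
parseDouble("2.1.2.1.2") // nil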

Here is the solution I would use:
var string = NSString(string: "string")
var double = string.doubleValue
if double != 0.0 || string == "0.0"
{
    //double is valid
}
else
{
    //double is invalid
}
Firstly, I couldn't store the double value back into the original string variable, so I assigned it to a new one. If you need to keep the original, copy the string into a new variable before converting it to a double, and use that copy in the comparison.
The way this works: the string is converted to a double in a separate variable. If the double is nonzero, the input was clearly valid. If the double is 0.0, the original string is compared against "0.0" as a string: if they match, the user genuinely entered 0.0 and the double is valid; if not, the input was illegal. Put the code you want to run for invalid input in the else clause, such as an alert informing the user that their input was invalid.
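For completeness: since Swift 2, the standard library's failable Double(String) initializer performs exactly this check for you, returning nil for input it cannot parse, so the 0.0 ambiguity disappears. A minimal sketch:
Double("123.45") // Optional(123.45)
Double("0.0")    // Optional(0.0): the legal zero input
Double("a23e")   // nil: illegal input

if let value = Double("a23e") {
    // value is a valid Double; illegal input never reaches here
} else {
    // handle the invalid input, e.g. show an alert
}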

Related

Swift - Binary operator '>=' cannot be applied to operands of type 'String' and 'Int'

Not really understanding why this isn't working. I'm pretty new to the Swift world.
The error I'm getting is Binary operator '>=' cannot be applied to operands of type 'String' and 'Int'
Could anyone help me understand why I'm getting this error? Do I need to convert the String to a Double or is there something else I'm totally missing? Again I'm new to Swift.
Do I need to convert the String to a Double?
Yes, that's basically it.
First, you must declare a variable to accumulate all the inputs:
var inputs = [Double]()
Observe that I'm declaring an array of Double because that's what we are interested in.
Then, each time you ask the input, convert the obtained String to Double and store it in your array:
print("Please enter a temperature\t", terminator: "")
var message : String = readLine()!
let value : Double = Double(message)!
inputs.append(value)
Finally, check all the accumulated values in inputs (you got this part right):
for value in inputs {
    // value is already a Double
    if value >= 80 {
        message = "hot!"
    }
    // etc.
}
I suggest researching how to convert to Double with error checking (i.e. how to detect an input like "100 hot!" and ignore it because it can't be converted).
Also, consider using a loop to read the values.
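A minimal sketch combining both suggestions, using the failable Double initializer so entries like "100 hot!" are skipped (Swift 3-style if let syntax; the fixed count of five readings is my own assumption):
var inputs = [Double]()

for _ in 1...5 {
    print("Please enter a temperature\t", terminator: "")
    // readLine() can return nil, and Double(_:) returns nil for bad input.
    if let line = readLine(), let value = Double(line) {
        inputs.append(value)
    } else {
        print("Not a number, ignoring it.")
    }
}

for value in inputs {
    if value >= 80 {
        print("hot!")
    }
}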

Setting Text View to be Int rather than String? Swift

I have a text view for the user to input an Int.
I am having an issue with saving the result into an Int variable, as by default it is a String.
What would be the best solution to force the output to be an Int? I am using a numeric-only keyboard so the user cannot enter letters.
code:
@IBOutlet weak var userExerciseWeight: UITextField!
var userExerciseWeightSet = Int()
if let userExerciseWeightSet = Int(self.userExerciseWeight.text!) {
}
You can simply use the if let construct for this. The text view's text will always be a String; all you need to do is convert that text to an Int.
if let intText = Int(self.textView.text) {
    print(intText)
}
I am having an issue with saving the result into an Int variable, as by default it is a String.
A text view is a view, not a model. What it displays is always a string. When that string happens to represent an Int, your code is responsible for converting the Int to a String when setting the initial value, and then for converting the String back to an Int when reading the value back from the view.
if let intVal = Int(textView.text) {
    // ... intVal parsed from the text is valid
} else {
    // ... the text does not represent an Int, e.g. it's empty
}
The same approach applies to displaying and reading values of other types: your program is responsible for formatting and parsing the data into an appropriate type.
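A minimal sketch of that round trip, assuming the UITextField outlet from the question (the class and method names are my own):
import UIKit

class ExerciseViewController: UIViewController {
    @IBOutlet weak var userExerciseWeight: UITextField!
    var userExerciseWeightSet = 0

    func showValue() {
        // Model -> view: format the Int as a String for display.
        userExerciseWeight.text = String(userExerciseWeightSet)
    }

    func readValue() {
        // View -> model: parse the String back to an Int, guarding against bad input.
        if let weight = Int(userExerciseWeight.text ?? "") {
            userExerciseWeightSet = weight
        } else {
            // text is empty or not a whole number; keep the old value
        }
    }
}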

How to create a single character String

I've reproduced this problem in a Swift playground but haven't solved it yet...
I'd like to print one of a range of characters in a UILabel. If I explicitly declare the character, it works:
// This works.
let value: String = "\u{f096}"
label.text = value // Displays the referenced character.
However, I want to construct the String. The code below appears to produce the same result as the line above, except that it doesn't. It just produces the String \u{f096} and not the character it references.
// This doesn't work
let n: Int = 0x95 + 1
print(String(n, radix: 16)) // Prints "96".
let value: String = "\\u{f0\(String(n, radix: 16))}"
label.text = value // Displays the String "\u{f096}".
I'm probably missing something simple. Any ideas?
How about you stop using string-conversion voodoo and use the standard library type UnicodeScalar?
You can also create Unicode scalar values directly from their numeric representation.
let airplane = UnicodeScalar(9992)
print(airplane)
// Prints "✈︎"
That UnicodeScalar.init actually returns an optional value, so you must unwrap it.
If you need a String, just convert it via the Character type:
let airplaneString: String = String(Character(airplane)) // Assuming that airplane here is unwrapped
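Applied to the question's computed code point, a minimal sketch (label is the UILabel from the question; note the unwrap of the optional initializer):
let n = 0x95 + 1                                    // 0x96
if let scalar = UnicodeScalar(UInt32(0xf000 + n)) { // U+F096
    let value = String(Character(scalar))
    label.text = value // displays the referenced character
}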

Why converting Double to Int doesn't return optional Int in Swift?

Converting a String to an Int returns an optional value, but converting a Double to an Int does not return an optional value. Why is that? I wanted to check whether a double value is bigger than the maximum Int value, but because the conversion does not return an optional, I am not able to check by using optional binding.
var stringNumber: String = "555"
var intValue = Int(stringNumber) // returns optional(555)
var doubleNumber: Double = 555
var fromDoubleToInt = Int(doubleNumber) // returns 555
So if I try to convert a double number bigger than the maximum Int value, it crashes instead of returning nil.
var doubleNumber: Double = 55555555555555555555
var fromDoubleToInt = Int(doubleNumber) // Crashes here
I know that there's another way to check whether a double number is bigger than the maximum Int value, but I'm curious as to why it's happening this way.
If we consider that for most doubles, a conversion to Int simply means dropping the decimal part:
let pieInt = Int(3.14159) // 3
Then the only case in which it would make sense for an Int(Double) initializer to return nil is overflow.
With strings, converting to Int returns an optional because, in general, strings such as "Hello world!" cannot be represented as an Int in a way that universally makes sense. So we return nil when the string cannot be represented as an integer. This includes, by the way, values that can be perfectly represented as doubles or floats:
Consider:
let iPi = Int("3.14159")
let dPi = Double("3.14159")
In this case, iPi is nil while dPi is 3.14159. Why? Because "3.14159" doesn't have a valid Int representation.
But meanwhile, when we use the Int initializer that takes a Double and returns a non-optional, we get a value.
So if that initializer were changed to return an optional, why should it return 3 for 3.14159 instead of nil? After all, 3.14159 can't be represented as an integer either.
But if you want a method that returns an optional Int, returning nil when the Double would overflow, you can just write that method.
extension Double {
    func toInt() -> Int? {
        let minInt = Double(Int.min)
        let maxInt = Double(Int.max)
        guard case minInt ... maxInt = self else {
            return nil
        }
        return Int(self)
    }
}
let a = 3.14159.toInt() // returns 3
let b = 555555555555555555555.5.toInt() // returns nil
Failable initializers and methods with Optional return types are designed for scenarios where you, the programmer, can't know whether a parameter value will cause failure, or where verifying that an operation will succeed is equivalent to performing the operation:
let intFromString = Int(someString)
let valueFromDict = dict[someKey]
Parsing an integer from a string requires checking the string for numeric/non-numeric characters, so the check is the same as the work. Likewise, checking a dictionary for the existence of a key is the same as looking up the value for the key.
By contrast, certain operations are things where you, the programmer, need to verify upfront that your parameters or preconditions meet expectations:
let foo = someArray[index]
let bar = UInt32(someUInt64)
let baz: UInt = someUInt - anotherUInt
You can, and in most cases should, test at runtime whether index < someArray.count, whether someUInt64 <= UInt64(UInt32.max), and whether someUInt >= anotherUInt. These assumptions are fundamental to working with those kinds of types. On the one hand, you really want to design around them from the start. On the other, you don't want every bit of math you do to be peppered with Optional unwrapping; that's why we have types whose axioms are stated upfront.
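A minimal sketch of those upfront checks, with made-up values:
let someArray = [1, 2, 3]
let index = 5
if index >= 0 && index < someArray.count {
    let foo = someArray[index]              // subscript cannot trap
}

let someUInt64: UInt64 = 4_000_000_000
if someUInt64 <= UInt64(UInt32.max) {
    let bar = UInt32(someUInt64)            // conversion cannot trap
}

let someUInt: UInt = 7
let anotherUInt: UInt = 9
if someUInt >= anotherUInt {
    let baz: UInt = someUInt - anotherUInt  // subtraction cannot underflow
}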

Why does calling removeAtIndex: on a string invert the result if called within a string literal?

Could someone explain this discrepancy? If you call removeAtIndex:, for example, to remove the first character from a string, it works as expected if you apply the method outside of a string literal. Like this:
var user = "#pdxCorey"
user.removeAtIndex(user.startIndex)
print("user: \(user)")
// user: pdxCorey
However, if you call removeAtIndex: inside of a string literal, the result is the inverse:
var user = "#pdxCorey"
print("user: \(user.removeAtIndex(user.startIndex))")
// user: #
What's going on here?
This has nothing to do with whether you call the function inside of a string literal or not. The String method
public mutating func removeAtIndex(i: Index) -> Character
removes the character at the given index from the string and returns that character as the function result.
var user = "#pdxCorey"
let firstChar = user.removeAtIndex(user.startIndex)
print(user) // pdxCorey
print(firstChar) // #
In the first case you are printing the value of user after removing the first character, which gives "pdxCorey".
In the second case you are printing the return value of removeAtIndex(), which is the removed character "#".
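The same behavior carries over to later Swift versions, where the method is spelled remove(at:):
var user = "#pdxCorey"
let firstChar = user.remove(at: user.startIndex) // returns the removed Character "#"
print("user: \(user)")                           // user: pdxCorey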