When I try the following:
var somestring = "5"
var somenumber = 2
var newnumber:Int = Int(somestring) + somenumber
I get this error:
binary operator '+' cannot be applied to two Int operands
What am I doing wrong? Shouldn't '+' be valid for adding two Ints?
That's a really misleading error message. The actual problem is that you can't simply construct an Int from a String that way. The proper way to do it is with the toInt method, like so:
var newnumber: Int = somestring.toInt()! + somenumber
Notice that toInt returns an optional, which is unwrapped here with !. If you're not sure the string represents an integer, you need to add error handling as well.
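For example, a minimal sketch of that error handling using optional binding (written with the Int(_:) initializer that replaced toInt in Swift 2):

```swift
let somestring = "5"
let somenumber = 2

// Int(_:) returns nil when the string isn't a valid integer,
// so optional binding handles both outcomes without crashing.
if let parsed = Int(somestring) {
    let newnumber = parsed + somenumber
    print(newnumber) // 7
} else {
    print("'\(somestring)' is not an integer")
}
```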
You should consider using the nil-coalescing operator ?? to fall back to 0 when no integer can be extracted from your string:
let someString = "5"
let someNumber = 2
let newNumber = (someString.toInt() ?? 0) + someNumber
println(newNumber) // 7
let anotherString = "a"
let anotherNumber = (anotherString.toInt() ?? 0) + someNumber
println(anotherNumber) // 2
update: Xcode 7.1.1 • Swift 2.1
let someString = "5"
let someNumber = 2
let newNumber = (Int(someString) ?? 0) + someNumber
print(newNumber) // 7
let anotherString = "a"
let anotherNumber = (Int(anotherString) ?? 0) + someNumber // 2
In Swift 3, I previously converted a Bool to an Int using the following method
let _ = Int(NSNumber(value: false))
After converting to Swift 4, I'm getting an "'init' is deprecated" warning. How else should this be done?
With Swift 4.2 and Swift 5, you can choose one of the following 5 solutions to solve your problem.
#1. Using NSNumber's intValue property
import Foundation
let integer = NSNumber(value: false).intValue
print(integer) // prints 0
#2. Using type casting
import Foundation
let integer = NSNumber(value: false) as? Int
print(String(describing: integer)) // prints Optional(0)
#3. Using Int's init(exactly:) initializer
import Foundation
let integer = Int(exactly: NSNumber(value: false))
print(String(describing: integer)) // prints Optional(0)
As an alternative to the previous code, you can use the more concise code below.
import Foundation
let integer = Int(exactly: false)
print(String(describing: integer)) // prints Optional(0)
#4. Using Int's init(truncating:) initializer
import Foundation
let integer = Int(truncating: false)
print(integer) // prints 0
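The "truncating" in the name matters when the wrapped value doesn't fit the target type; a small sketch (using an assumed out-of-range value rather than a Bool) of that behavior:

```swift
import Foundation

// init(truncating:) keeps only the low-order bits that fit the target type,
// so 300 (0b1_0010_1100) becomes 44 (0b0010_1100) in an Int8.
let truncated = Int8(truncating: NSNumber(value: 300))
print(truncated) // 44

// init(exactly:) instead fails (returns nil) for out-of-range values.
let exact = Int8(exactly: NSNumber(value: 300))
print(String(describing: exact)) // nil
```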
#5. Using control flow
Note that the following sample codes do not require importing Foundation.
Usage #1 (if statement):
let integer: Int
let bool = false
if bool {
    integer = 1
} else {
    integer = 0
}
print(integer) // prints 0
Usage #2 (ternary operator):
let integer = false ? 1 : 0
print(integer) // prints 0
You can use NSNumber property intValue:
let x = NSNumber(value: false).intValue
You can also use init?(exactly number: NSNumber) initializer:
let y = Int(exactly: NSNumber(value: false))
or, as mentioned in the comments by @Hamish, the numeric initializer has been renamed to init(truncating:)
let z = Int(truncating: NSNumber(value: false))
or let Xcode implicitly create an NSNumber from it, as mentioned by @MartinR
let w = Int(truncating: false)
Another option you have is to extend protocol BinaryInteger (Swift 4) or Integer (Swift 3) and create your own non-failable initializer that takes a Bool parameter and returns the original type using the ternary operator, as suggested in the comments by @vadian:
extension BinaryInteger {
    init(_ bool: Bool) {
        self = bool ? 1 : 0
    }
}
let a = Int(true) // 1
let b = Int(false) // 0
let c = UInt8(true) // 1
let d = UInt8(false) // 0
My code does not work right now. I am trying to take names and add each one to itself in the loop, but the compiler is giving me an error message and the code is not being printed.
let names = [Double(2),3,8] as [Any]
let count = names.count
for i in 0..<count {
print((names[i]) + names[i])
}
Because Any doesn't have a + operator.
If you want to add two values and print the result, you need to cast Any to a calculable type such as Double. This will give you the result you expected:
let names = [Double(2),3,8] as [Any]
let count = names.count
for i in 0..<count {
    if let value = names[i] as? Double {
        print(value + value)
    }
}
The use of as [Any] makes no sense. You can't add two objects of type Any, which is probably what your error is about.
Simply drop it and your code works.
let names = [Double(2),3,8]
let count = names.count
for i in 0..<count {
    print(names[i] + names[i])
}
Output:
4.0
6.0
16.0
Better yet:
let names = [Double(2),3,8]
for num in names {
    print(num + num)
}
What is the correct way to set fullLength here?
var index = foo.index(foo.startIndex, offsetBy: SOME_NUM)
let length = Int(foo.substring(to: index))
if let fullLength = length? + SOME_NUM {
return
}
The problem here is that length is an Optional. I had this working with
fullLength = length! + SOME_NUM, but I don't want to risk using ! in the event length is nil.
You could rewrite it as
if let length = length {
    let fullLength = length + SOME_NUM
}
You can make use of the nil-coalescing operator. To help you understand it, let us take an example, which will also solve your problem. The nil-coalescing operator ?? in a ?? b unwraps an optional a if it contains a value, or returns a default value b if a is nil. It is shorthand for
a != nil ? a! : b
Now coming to your question
var index = foo.index(foo.startIndex, offsetBy: SOME_NUM)
let length = Int(foo.substring(to: index))
let someDefaultValue = 0
let fullLength = (length ?? someDefaultValue) + SOME_NUM
// you can also replace someDefaultValue with 0 in the code above
Alternatively, with optional binding:
if let length = length {
    let fullLength = length + SOME_NUM
}
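One detail worth noting here, sketched with assumed values: ?? has lower precedence than +, so without parentheses the addition is absorbed into the default value rather than applied to the unwrapped result:

```swift
let length: Int? = 3
let SOME_NUM = 5

// Parsed as length ?? (0 + SOME_NUM): the default would be 5, but
// since length is non-nil, the addition never takes effect.
let unparenthesized = length ?? 0 + SOME_NUM
print(unparenthesized) // 3

// Parsed as intended: unwrap (or default to 0), then add SOME_NUM.
let parenthesized = (length ?? 0) + SOME_NUM
print(parenthesized) // 8
```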
I would do this:
let index = foo.index(foo.startIndex, offsetBy: SOME_NUM)
guard let length = Int(foo.substring(to: index)) else {
    // show error
    return // or return a reasonable value for the error case
}
let fullLength = length + SOME_NUM
This approach treats length being a valid number (not nil) as a precondition for executing the rest of the code.
Hello, I'm new to Swift and I'm building a calculator in Xcode. In my main storyboard I have a UIButton, a UILabel and a UITextField that will get a number; by pressing the button, the label's text should show the entered number + 5. In my app I need to convert a String variable to Int.
I tried the snippet below, but I didn't get any meaningful result.
var e = texti.text
let f: Int? = e.toInt()
let kashk = f * 2
label.text = "\(pashm)"
To make it clean and Swifty, I suggest this approach:
Swift 2 / 3
var string = "42" // here you would put your 'texti.text', assuming texti is for example UILabel
if let intVersion = Int(string) { // Swift 1.2: string.toInt()
    let multiplied = 2 * intVersion
    let multipliedString = "\(multiplied)"
    // use the string as you wish, for example 'texti.text = multipliedString'
} else {
    // handle the fact that toInt() didn't yield an integer value
}
If you want to calculate with that new integer, you have to unwrap it by putting an exclamation mark after the variable name:
let stringnumber = "12"
let intnumber: Int? = Int(stringnumber)
print(intnumber! + 3)
The result would be:
15
var string = "12"
var intVersion = string.toInt()
let intMultiplied = intVersion! * 2
label.text = "\(intMultiplied)"
Regarding how to convert a string to an integer:
var myString = "12" //Assign the value of your textfield
if let myInt = myString.toInt(){
    //myInt is an integer with the value of "12"
} else {
    //Do something, the text in the textfield is not an integer
}
The if let makes sure that your value can be converted to an integer.
.toInt() returns an optional Int. If your string can be converted to an integer it will be, else it will return nil. The if let branch will only be entered if your string can be converted to an integer.
Since the new variable (constant to be exact) is an integer, you can make a new variable and add 5 to the value of your integer:
var myString = "12" //Assign the value of your textfield
if let myInt = myString.toInt(){
    //myInt is an integer with the value of "12"
    let newInt = myInt + 5
    myTextfield.text = "\(newInt)"
    //The text of the textfield will be: "17" (12 + 5)
} else {
    //Do something, the text in the textfield is not an integer
}
I get an error when declaring i
var users = Array<Dictionary<String,Any>>()
users.append(["Name":"user1","Age":20])
var i:Int = Int(users[0]["Age"])
How to get the int value?
var i = users[0]["Age"] as Int
As GoZoner points out, if you don't know that the downcast will succeed, use:
var i = users[0]["Age"] as? Int
The result will be nil if it fails.
Swift 4 answer:
if let str = users[0]["Age"] as? String, let i = Int(str) {
    // do what you want with i
}
If you are sure the result is an Int then use:
var i = users[0]["Age"] as! Int
but if you are unsure and want a nil value if it is not an Int then use:
var i = users[0]["Age"] as? Int
“Use the optional form of the type cast operator (as?) when you are not sure if the downcast will succeed. This form of the operator will always return an optional value, and the value will be nil if the downcast was not possible. This enables you to check for a successful downcast.”
Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks. https://itun.es/us/jEUH0.l
This may have worked previously, but it's not the answer for Swift 3. Just to clarify, I don't have the answer for Swift 3, below is my testing using the above answer, and clearly it doesn't work.
My data comes from an NSDictionary
print("subvalue[multi] = \(subvalue["multi"]!)")
print("as Int = \(subvalue["multi"]! as? Int)")
if let multiString = subvalue["multi"] as? String {
    print("as String = \(multiString)")
    print("as Int = \(Int(multiString)!)")
}
The output generated is:
subvalue[multi] = 1
as Int = nil
Just to spell it out:
a) The original value is of type Any? and the value is: 1
b) Casting to Int results in nil
c) Casting to String results in nil (the print lines never execute)
EDIT
The answer is to use NSNumber:
let num = subvalue["multi"] as? NSNumber
Then we can convert the number to an integer:
let myint = num?.intValue
if let id = json["productID"] as? String {
    self.productID = Int32(id, radix: 10)!
}
This worked for me. json["productID"] is of type Any.
If it can be cast to a String, then convert it to an integer.
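As a sketch (with an assumed JSON value standing in for the real dictionary), the force unwrap can be avoided by binding both the cast and the conversion, so a missing or non-numeric value is handled instead of crashing:

```swift
// Assumed stand-in for the JSON dictionary in question.
let json: [String: Any] = ["productID": "42"]

// Chain the optional cast and the failable Int32 conversion in one if-let.
if let id = json["productID"] as? String,
   let productID = Int32(id, radix: 10) {
    print(productID) // 42
} else {
    print("productID is missing, not a String, or not a number")
}
```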