How can I convert Int32 to Int in Swift? It should be easy, but I can only find the reverse conversion. Or is my problem something different?
I have a value stored in Core Data and I want to return it as an Int.
Here is the code I am using, which does not work:
func myNumber() -> Int {
    var myUnit: NSManagedObject
    myUnit = self.getObject("EntityName") // This is working.
    return Int(myUnit.valueForKey("theNUMBER")?.intValue!)
}
Am I missing something or isn't this ridiculously easy?
let number1: Int32 = 10
let number2 = Int(number1)
The error is the ? after valueForKey: the Int initializer doesn't accept optionals. Writing myUnit.valueForKey("theNUMBER")?.intValue! still gives you an optional value, and the ! at the end doesn't help.
Just replace with this:
return Int(myUnit.valueForKey("theNUMBER")!.intValue)
But you could also do it like this if you want it to be fail-safe:
return myUnit.valueForKey(“theNUMBER”)?.integerValue ?? 0
And to shorten your function you can do this:
func myNumber() -> Int {
    let myUnit = self.getObject("EntityName") as! NSManagedObject
    return myUnit.valueForKey("theNUMBER")?.integerValue ?? 0
}
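For reference, here is a minimal self-contained sketch of the same optional-chaining plus nil-coalescing pattern, in modern Swift syntax, with a hypothetical `stored` value standing in for the Core Data lookup:

```swift
import Foundation

// `stored` stands in for the result of a key-value lookup, which is an optional.
let stored: Any? = NSNumber(value: 42)

// Optional chaining produces Int?, and ?? supplies a default when the lookup fails.
let number = (stored as? NSNumber)?.intValue ?? 0
print(number) // 42
```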
Swift 4.0 produces "Cannot invoke initializer for type 'Int' with an argument list of type '(() -> Int32)'"
let number1: Int32 = 10
let number2 = Int(number1)
Simply do this
Int("\(Int32value)")
I'm unable to understand why Swift is making things difficult. Sometimes ? makes things twisted; adding ! to the optional Int32 and then converting it to Int works:
let number1 = someInt32!
let number2 = Int(number1)
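If you'd rather avoid the force unwrap, the same conversion can be sketched with optional handling instead (someInt32 here is a hypothetical optional value):

```swift
// An optional Int32, e.g. as it might come out of an API.
let someInt32: Int32? = 10

// map converts only when a value is present; ?? supplies a fallback instead of crashing.
let number = someInt32.map { Int($0) } ?? 0
print(number) // 10
```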
I have the following simple code. After the last line (sum), Swift gives an error, and var sum = input * 2 has to be changed to var sum = input! * 2. I am not sure why, since I didn't declare the variable as optional. Does Swift make input optional? Thanks
var a = "2"
let input = Int(a)
var sum = input * 2
Converting a string to an Int returns an Optional because it could fail.
For example, let result = Int("foo") will return nil, because "foo" is not a valid Int.
What if you did
var a = "This is most definitely not a number, and even if it were, it's too long to fit within an Int 123457993849038409238490ff9f-09-0f9-09f dd0d0066646464646464349023849038490328 I'm a teapot"
let input = Int(a)
Do you think that could be converted to an Int? Now do you understand why it's an optional?
What happens if the String can't be converted to an Int? Such as Int("A"). It becomes an optional because the compiler can't know for sure that the String you are passing in can become an Int.
guard let intVal = Int(a) else { return }
or
if let intVal = Int(a) {
    // you have a valid Int here
}
is the way to go about handling this situation
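Putting that together, a small runnable sketch of the guard-based style (doubleIfNumeric is a hypothetical helper, not from the question):

```swift
// Returns twice the parsed value, or nil when the string isn't a valid Int.
func doubleIfNumeric(_ s: String) -> Int? {
    guard let value = Int(s) else { return nil }
    return value * 2
}

print(doubleIfNumeric("2") as Any)   // Optional(4)
print(doubleIfNumeric("foo") as Any) // nil
```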
I'm really stuck! I'm not an expert at Obj-C, and now I am trying to use Swift. I thought it would be much simpler, but it wasn't. I remember Craig saying that they call Swift "Objective-C without the C", but there are too many C types in OS X's Foundation. The documentation says that many Obj-C types automatically convert, possibly bidirectionally, to Swift types. I'm curious: how about C types?
Here's where I'm stuck:
// array1: [String?], event.keyCode.value: Int16
let s = array1[event.keyCode.value] // error: 'Int16' is not convertible to 'Int'
I tried some things the Obj-C way:
let index = (Int) event.keyCode.value // error
or
let index = (Int32) event.keyCode.value // error again; Swift doesn't seem to support this syntax
What is the proper way to convert Int16 to Int?
To convert a number from one type to another, you have to create a new instance, passing the source value as parameter - for example:
let int16: Int16 = 20
let int: Int = Int(int16)
let int32: Int32 = Int32(int16)
I used explicit types for the variable declarations to make the concept clear, but in all the above cases the type can be inferred:
let int16: Int16 = 20
let int = Int(int16)
let int32 = Int32(int16)
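One caveat worth noting: when converting to a narrower type, these initializers trap at runtime if the value doesn't fit. In more recent Swift versions you can use init(exactly:) to get nil instead of a crash; a brief sketch:

```swift
let big = 40_000

// Int16 holds at most 32_767, so the exact conversion fails and returns nil.
let narrowed = Int16(exactly: big)
print(narrowed as Any) // nil

// A value that fits converts successfully.
let small = Int16(exactly: 20)
print(small as Any) // Optional(20)
```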
This is not how this type of casting works in Swift. Instead, use:
let a : Int16 = 1
let b : Int = Int(a)
So you basically instantiate one variable on the basis of the other.
In my case ("I was trying to convert a Core Data Int16 to Int") explicit casting didn't work. I did it this way:
let quantity_string: String = String(format: "%d", yourInt16value)
let quantity: Int = Int(quantity_string)! // Int(String) returns an optional, so unwrap it
This is somewhat related to this question: How to properly store timestamp (ms since 1970)
Is there a way to type-cast an AnyObject to Int64? I am receiving a huge number via JSON; this number arrives at my class as AnyObject. How can I cast it to Int64? Xcode just says it's not possible.
JSON numbers are NSNumber, so you'll want to go through there.
import Foundation
var json:AnyObject = NSNumber(longLong: 1234567890123456789)
var num = json as? NSNumber
var result = num?.longLongValue
Note that result is Int64?, since you don't know that this conversion will be successful.
You can cast from a AnyObject to an Int with the as type cast operator, but to downcast into different numeric types you need to use the target type's initializer.
var o:AnyObject = 1
var n:Int = o as Int
var u:Int64 = Int64(n)
Try SwiftJSON, which is a better way to deal with JSON data in Swift:
let json = SwiftJSON.JSON(data: dataFromServer)
if let number = json["number"].longLong {
    // do what you want
} else {
    // print the error message if you like
    println(json["number"])
}
As @Rob Napier's answer says, you are dealing with NSNumber. If you're sure you have a valid one, you can do this to get an Int64:
(json["key"] as! NSNumber).longLongValue
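In current Swift syntax the same NSNumber route looks like this (a sketch; the json value here stands in for the decoded server response):

```swift
import Foundation

// A JSON number arrives as Any/AnyObject; bridge through NSNumber to reach Int64.
let json: Any = NSNumber(value: 1_234_567_890_123_456_789 as Int64)

// Conditional cast keeps this safe if the value isn't a number after all.
let result = (json as? NSNumber)?.int64Value
print(result as Any) // Optional(1234567890123456789)
```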
I have a Dictionary with a String and AnyObject, so [String: AnyObject].
In a function I want to check the type of the dict value. So this code worked in Xcode6-Beta 3:
for (key, value: AnyObject) in contents {
    ...
    } else if value is Float {
        stringValue = String(value as Float) + ","
Now I get the error: AnyObject is not convertible to Float
stringValue = String(Float(value)) + "," doesn't work either.
Any ideas?
There is no problem with casting AnyObject to Float; as you can see, the instruction below executes without errors:
var f: Float = value as Float
The real issue is that Swift's String has no initializer that accepts a Float. If you do
var str: String = String(f) // error: String has no initializer for Float
Swift has only added a String initializer for Int, which converts directly; there is no initializer for Float.
var i: Int = value as Int
var str: String = String(i) // this will run fine
Now, to solve your problem you can do:
for (key, value: AnyObject) in contents {
    if value is Int {
    } else if value is Float {
        // take this in a var and use it
        var stringValue = "\(value as Float),"
    }
}
In the future Swift may add an initializer for Float, but currently there is none.
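For what it's worth, in later Swift versions string interpolation (and String(describing:)) handles Float directly, so the interpolation workaround above stopped being a workaround at all; a quick sketch:

```swift
let f: Float = 1.5

// In modern Swift, interpolation works for any type, including Float.
let s = "\(f),"
print(s) // 1.5,
```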
Replace "as" with "as!" to force downcast.
Please remember that you can use the forced form of the type-cast operator (as!) only when you are sure the downcast will always succeed; otherwise a runtime error is triggered. In your particular case there is no problem, since there is a previous check (if value is Float).
You can't cast to float because:
AnyObject can represent an instance of any class type.
From Swift Programming Guide
But Float isn't a class. You will have to use Any instead:
Any can represent an instance of any type at all, apart from function types.
From Swift Programming Guide
This should work (but I can't test on my current Mac):
for (key, value: Any) in contents {
    ...
    } else if value is Float {
        stringValue = String(value as Float) + ","
As wumm said, you can't directly cast to Float because Float is not a class type. You may be able to cast it to an NSNumber and bridge to a Float though. Try
value as NSNumber as Float
This strategy works with casting AnyObject into a Swift String.
Using your code and fixing the concatenation it would look like:
for (key, value: AnyObject) in contents {
    if value is Float {
        stringValue = "\(stringValue) \(value as NSNumber as Float),"
    }
}
I was working on an application that used a text field and translated it into an integer. Previously my code
textField.text.toInt()
worked. Now Swift declares this as an error and is telling me to do
textField.text!.toInt()
and it says there is no toInt() and to try using Int(). That doesn't work either. What just happened?
In Swift 2.x, the .toInt() function was removed from String. As a replacement, Int now has an initializer that accepts a String:
Int(myString)
In your case, you could use Int(textField.text!) instead of textField.text!.toInt().
Swift 1.x
let myString: String = "256"
let myInt: Int? = myString.toInt()
Swift 2.x, 3.x
let myString: String = "256"
let myInt: Int? = Int(myString)
Swift 2
let myString: NSString = "123"
let myStringToInt: Int = Int(myString.intValue)
Declare your string as an NSString and use the intValue getter.
It's easy enough to create your own extension method to put this back in:
extension String {
    func toInt() -> Int? {
        return Int(self)
    }
}
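Usage of the extension then mirrors the old API; a quick self-contained sketch:

```swift
// Re-adds the pre-Swift-2 convenience method on top of the Int(String) initializer.
extension String {
    func toInt() -> Int? {
        return Int(self)
    }
}

print("42".toInt() as Any)  // Optional(42)
print("foo".toInt() as Any) // nil
```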
I had the same issue in payment-processing apps.
Swift 1.0
let expMonth = UInt(expirationDate[0].toInt()!)
let expYear = UInt(expirationDate[1].toInt()!)
Later, in Swift 2.0:
let expMonth = Int(expirationDate[0])
let expYear = Int(expirationDate[1])
That gave me some errors too! This code solved them:
let myString: String = dataEntered.text! // data entered in the text field
var myInt: Int? = Int(myString) // conversion of the string to an Int
myInt = myInt! * 2 // manipulating the integer: sum, difference, product, division
finalOutput.text! = "\(myInt!)" // sets the finalOutput label to myInt (unwrapped, to avoid printing "Optional(...)")