I'm really stuck! I'm not an expert in ObjC, and now I'm trying to use Swift. I thought it would be much simpler, but it wasn't. I remember Craig saying that they call Swift "Objective-C without the C", but there are still plenty of C types in OS X's Foundation. The documentation says that many ObjC types convert automatically, sometimes bidirectionally, to Swift types. I'm curious: what about C types?
Here's where I'm stuck:
// array1: [String?], event.keyCode.value: Int16
let s = array1[event.keyCode.value] // error: 'Int16' is not convertible to 'Int'
I tried some ObjC-style casts:
let index = (Int) event.keyCode.value // error
or
let index = (Int32) event.keyCode.value // error again; Swift doesn't seem to support this syntax
What is the proper way to convert Int16 to Int?
To convert a number from one type to another, you have to create a new instance, passing the source value as a parameter. For example:
let int16: Int16 = 20
let int: Int = Int(int16)
let int32: Int32 = Int32(int16)
I used explicit types in the variable declarations to make the concept clear, but in all of the above cases the type can be inferred:
let int16: Int16 = 20
let int = Int(int16)
let int32 = Int32(int16)
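Applied to the example from the original question, a minimal sketch (assuming array1 and event as described there):
let index = Int(event.keyCode.value) // convert the Int16 to Int first
let s = array1[index]                // the subscript now type-checks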
This is not how this type of casting works in Swift. Instead, use:
let a : Int16 = 1
let b : Int = Int(a)
So you basically initialize one value from the other.
In my case (I was trying to convert a Core Data Int16 to Int) explicit casting didn't work, so I did it this way:
let quantity_string: String = String(format: "%d", yourInt16value)
let quantity: Int = Int(quantity_string)! // Int(String) is failable, so the result must be unwrapped
I searched for the answer over the past few days, and a lot of the answers are very old (around Swift 2 and 1.2).
I want to get a character from a Unicode code point stored in a variable, because for some unknown reason this construction won't work in Swift:
print("\u{\(variable)}") // there should be a proposal for including this in Swift 6
People advise using UnicodeScalars. However, Apple must have introduced something new in Swift 5. I found a tutorial here, but this code fragment
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)) // 大 error here
let string2 = String(UnicodeScalar(value2)) // 🍎 error here
does not work: with string1 and string2 I get the error "no exact matches in call to initializer". Since it presumably worked when the author posted it, something must have changed in a later version of Swift. What has changed under the hood? The Strings section of Apple's handbook doesn't reveal anything.
I am trying to rewrite some TypeScript code in Swift, and in JS it's as simple as:
for (let i = str.length; i >= 1; i -= 2) {
    r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}
and I've been struggling with this for the past 2 days without success!
The UnicodeScalar initializers taking a UInt16 or UInt32 argument are failable (they return nil if the passed argument is not a valid Unicode scalar value):
public init?(_ v: UInt16)
public init?(_ v: UInt32)
The optional must be unwrapped before passing it to the String initializer.
Only the initializer taking a UInt8 argument is non-failable:
public init(_ v: UInt8)
So this compiles and produces the expected result:
let value0: UInt8 = 0x61
let value1: UInt16 = 0x5927
let value2: UInt32 = 0x1F34E
let string0 = String(UnicodeScalar(value0)) // a
let string1 = String(UnicodeScalar(value1)!) // 大
// ^--- unwrap optional
let string2 = String(UnicodeScalar(value2)!) // 🍎
// ^--- unwrap optional
Of course, in your real code, you would use optional binding and not forced unwrapping.
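For example, here is a rough sketch of the TypeScript loop from the question, using optional binding for both failable steps (the input string is a hypothetical example; invalid hex pairs are simply skipped):
let str = "48656c6c6f" // hypothetical input: "Hello" encoded as hex byte pairs
var r = ""
var i = str.count
while i >= 1 {
    // take the (up to) two hex digits ending at position i
    let start = str.index(str.startIndex, offsetBy: max(i - 2, 0))
    let end = str.index(str.startIndex, offsetBy: i)
    // both the hex parse and the scalar conversion are failable,
    // so bind them conditionally instead of force-unwrapping
    if let code = UInt32(str[start..<end], radix: 16),
       let scalar = UnicodeScalar(code) {
        r = String(scalar) + r
    }
    i -= 2
}
print(r) // Hello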
I get a result of type UnsafeMutablePointer<UInt> from a function in Swift.
Can I cast it to UInt?
Just use the memory property to access the underlying data.
let ptr: UnsafeMutablePointer<UInt> = funcReturningMutablePtr()
let theValue: UInt = ptr.memory
The type annotations are for clarity, but are not necessary.
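Note that in Swift 3 and later the memory property was renamed pointee, so with a current compiler the same access is written as:
let theValue: UInt = ptr.pointee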
Say I have
var dict = parseJSON(getJSON(url)) // This results in an NSDictionary
Why is
let a = dict["list"]![1]! as NSDictionary
let b = a["temp"]!["min"]! as Float
allowed, and this:
let b = dict["list"]![1]!["temp"]!["min"]! as Float
results in an error:
Type 'String' does not conform to protocol 'NSCopying'
Please explain why this happens; note that I'm new to Swift and have no experience.
dict["list"]![1]! returns an object that is not known yet (AnyObject) and without the proper cast the compiler cannot know that the returned object is a dictionary
In your first example you properly cast the returned value to a dictionary and only then you can extract the value you expect.
To amend the answer from @giorashc: use explicit casting like
let b = (dict["list"]![1]! as NSDictionary)["temp"]!["min"]! as Float
But splitting it up is more readable in cases like this.
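If you want to avoid the forced unwraps entirely, a sketch using optional binding and conditional casts (assuming the same dict as above, in current Swift syntax):
if let list = dict["list"] as? [AnyObject], list.count > 1,
   let item = list[1] as? NSDictionary,
   let temp = item["temp"] as? NSDictionary,
   let min = temp["min"] as? Float {
    // min is a non-optional Float here
}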
It should be easy but I can only find the reverse conversion.
How can I convert Int32 to Int in Swift?
Or is the problem something different?
I have a value stored in Core Data and I want to return it as an Int.
Here is the code I am using, which does not work:
func myNumber() -> Int {
    var myUnit: NSManagedObject
    myUnit = self.getObject("EntityName") // This is working.
    return Int(myUnit.valueForKey("theNUMBER")?.intValue!)
}
Am I missing something or isn't this ridiculously easy?
let number1: Int32 = 10
let number2 = Int(number1)
The error is the ? after valueForKey.
The Int initializer doesn't accept optionals.
Writing myUnit.valueForKey("theNUMBER")?.intValue! gives you an optional value, and the ! at the end doesn't help.
Just replace it with this:
return Int(myUnit.valueForKey("theNUMBER")!.intValue)
But you could also do it like this if you want it to be fail-safe:
return myUnit.valueForKey("theNUMBER")?.integerValue ?? 0
And to shorten your function, you can do this:
func myNumber() -> Int {
    let myUnit = self.getObject("EntityName") as! NSManagedObject
    return myUnit.valueForKey("theNUMBER")?.integerValue ?? 0
}
Swift 4.0 produces "Cannot invoke initializer for type 'Int' with an argument list of type '(() -> Int32)'". (That particular error usually means a method reference was passed without being called, i.e. with missing parentheses.)
let number1: Int32 = 10
let number2 = Int(number1)
Simply do this:
Int("\(Int32value)")! // Int(String) is failable, so the result must be unwrapped
I'm unable to understand why Swift makes things so difficult.
Sometimes "?" makes things twisted;
adding "!" to the Int32 value and then converting it to Int works:
let number1 = someInt32!
let number2 = Int(number1)
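Note that force-unwrapping crashes if the value is nil; a safer sketch (assuming someInt32 is an Int32? as above):
if let value = someInt32 {
    let number = Int(value) // only reached when someInt32 is non-nil
}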
This is somewhat related to this question: How to properly store timestamp (ms since 1970)
Is there a way to typecast an AnyObject to Int64? I am receiving a huge number via JSON, and this number arrives at my class as AnyObject. How can I cast it to Int64? Xcode just says it's not possible.
JSON numbers are NSNumber, so you'll want to go through there.
import Foundation
var json:AnyObject = NSNumber(longLong: 1234567890123456789)
var num = json as? NSNumber
var result = num?.longLongValue
Note that result is Int64?, since you don't know that this conversion will be successful.
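Put together, a compact sketch of the whole conversion with optional binding (assuming json as above):
if let num = (json as? NSNumber)?.longLongValue {
    // num is a non-optional Int64 here
}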
You can cast from an AnyObject to an Int with the as type cast operator, but to downcast into different numeric types you need to use the target type's initializer.
var o:AnyObject = 1
var n:Int = o as Int
var u:Int64 = Int64(n)
Try SwiftyJSON, which is a better way to deal with JSON data in Swift:
let json = SwiftyJSON.JSON(data: dataFromServer)
if let number = json["number"].longLong {
//do what you want
} else {
//print the error message if you like
println(json["number"])
}
As @Rob Napier's answer says, you are dealing with NSNumber. If you're sure you have a valid one, you can do this to get an Int64:
(json["key"] as! NSNumber).longLongValue