How to cast from UInt16 to NSNumber - swift

I have a UInt16 variable that I would like to pass to a legacy function that requires an NSNumber.
If I try:
var castAsNSNumber : NSNumber = myUInt16
I get a compiler error: 'UInt16' is not convertible to 'NSNumber'.
Question
How can I recast this as an NSNumber?

var castAsNSNumber = NSNumber(unsignedShort: myUInt16)
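Note that unsignedShort: is the Swift 1/2 spelling of the initializer; since Swift 3 the same conversion is written with NSNumber(value:). A minimal equivalent, where myUInt16 stands in for the question's variable:
import Foundation
let myUInt16: UInt16 = 42                      // stand-in for the question's variable
let castAsNSNumber = NSNumber(value: myUInt16) // Swift 3+ spelling of NSNumber(unsignedShort:)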

Swift 4 or newer (still works in 5.5.1):
import Foundation
let u16 = UInt16(24)
let nsnum = u16 as NSNumber
let andBack = UInt16(truncating: nsnum)
print("\(u16), \(nsnum), \(andBack)")
Try it yourself:
https://swiftfiddle.com/bido2kr4x5fqnlptznxtbra76e
Or instead of
let andBack = UInt16(truncating: nsnum)
you can also use
let andBack = nsnum.uint16Value

Related

How to convert string to UInt32?

I am a beginner in Swift and I am having a problem with converting a string to UInt32.
let Generator = (ReadableJSON["People"][Person]["F1"].string! as NSString).doubleValue
if Generator == 1 {
    NameLabel1 = ReadableJSON["People"][Person]["A1"].string as String!
    NameImeNaObekt = ReadableJSON["People"][Person]["B1"].string as String!
    Picture = ReadableJSON["People"][Person]["E1"].string as String!
} else {
    let RGen = arc4random_uniform("\(Generator)") // here is the error
}
Would you advise me how to fix it? The problem is in the last line, which is red and says: Cannot convert value of type 'String' to 'UInt32'.
The main idea is that I am reading the number from a JSON file and I have to populate this number into the arc4random_uniform.
arc4random_uniform(UInt32)
accepts a UInt32 value, but you are passing a String value to it.
This is what converts your number to a String:
"\(Generator)"
The last line should look like this:
let RGen = arc4random_uniform(UInt32(Generator))
And if you want RGen as a String, you can do it this way:
"\(RGen)"
String(RGen)
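Putting those pieces together, a minimal sketch (Generator here is a stand-in for the Double read from the JSON):
import Foundation
let Generator: Double = 5                          // stand-in for the value read from the JSON
let RGen = arc4random_uniform(UInt32(Generator))   // UInt32 in, UInt32 out
let RGenString = String(RGen)                      // back to a String if you need one
print(RGenString)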
var RGen = 0
RGen = Int(arc4random_uniform(UInt32(Generator)))
or
let RGen = Int(arc4random_uniform(UInt32(Generator)))

Convert String to UnsafeMutablePointer<UInt8> Swift

Is there a way I could convert a String to an UnsafeMutablePointer<UInt8> in Swift, avoiding data loss?
let s = "83"
var i: UInt8 = UInt8.init(s)!
let up: UnsafeMutablePointer<UInt8> = UnsafeMutablePointer<UInt8>.init(&i)
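If the goal is a pointer to the string's bytes rather than to its numeric value, a safer sketch (an assumption about the intended use) copies the UTF-8 bytes into an array and only uses the pointer inside a closure, where it is guaranteed to stay valid:
let s = "83"
var bytes = Array(s.utf8)                        // [UInt8] copy of the string's UTF-8 bytes
bytes.withUnsafeMutableBufferPointer { buffer in
    // buffer.baseAddress is an UnsafeMutablePointer<UInt8>?, valid only inside this closure
    if let p = buffer.baseAddress {
        print(p.pointee)                         // 56, the ASCII code of "8"
    }
}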

Swift: NSArray to Set?

I am trying to convert an NSArray to a Swift Set.
Not having much luck.
What is the proper way to do so?
For example if I have an NSArray of numbers:
@[1, 2, 3, 4, 5, 6, 7, 8]
How do I create a Swift Set from that NSArray?
If you know for sure that the NSArray contains only number objects,
then you can convert it to a Swift array of Int (or Double or
NSNumber, depending on your needs) and create a set from that:
let nsArray = NSArray(array: [1,2,3,4,5,6,7,8])
let set = Set(nsArray as! [Int])
If that is not guaranteed, use an optional cast:
if let set = (nsArray as? [Int]).map(Set.init) {
    print(set)
} else {
    // not an array of numbers
}
Another variant (motivated by @JAL's comments):
let set = Set(nsArray.flatMap { $0 as? Int })
// Swift 4.1 and later:
let set = Set(nsArray.compactMap { $0 as? Int })
This gives a set of all NSArray elements which are convertible
to Int, and silently ignores all other elements.
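For example, with a hypothetical mixed array, the non-numeric element is simply dropped:
import Foundation
let mixed = NSArray(array: [1, 2, "three", 4])
let set = Set(mixed.compactMap { $0 as? Int })
print(set) // [1, 2, 4] in some order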
import Foundation
let nsarr: NSArray = NSArray(array: [1,2,3,4,5])
var set: Set<Int>
guard let arr = nsarr as? Array<Int> else {
    exit(-1)
}
set = Set(arr)
print(set.dynamicType) // Swift 2; use type(of: set) in Swift 3 and later
dump(set)
/*
Set<Int>
▿ 5 members
- [0]: 5
- [1]: 2
- [2]: 3
- [3]: 1
- [4]: 4
*/
With the help of free bridging, it should be easy.
If you know you're working with Ints, you could always iterate through your array and manually add each element to a Set<Int>:
let theArray = NSArray(array: [1,2,3,4,5,6,7,8])
var theSet = Set<Int>()
for number in (theArray as? [Int])! {
    theSet.insert(number)
}
print(theSet) // "[2, 4, 5, 6, 7, 3, 1, 8]\n"
I'm trying to work out a more elegant solution with map; I'll update this answer as I make more progress.
Thanks to MartinR's suggestion to use unionInPlace (which takes the SequenceType returned from map) instead of insert on the Set, this can be accomplished like so (in Swift 3 and later, unionInPlace is spelled formUnion):
let theArray = NSArray(array: [1,2,3,4,5,6,7,8])
var mySet = Set<Int>()
mySet.unionInPlace(theArray.map { $0 as! Int })
Note that this may not be the safest solution due to the explicit cast to Int.
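A variant that avoids the forced cast, using compactMap together with the Swift 3+ spelling formUnion, might look like this:
import Foundation
let theArray = NSArray(array: [1, 2, 3, 4, 5, 6, 7, 8])
var mySet = Set<Int>()
mySet.formUnion(theArray.compactMap { $0 as? Int }) // non-Int elements are silently ignored
print(mySet)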
var array = [1,2,3,4,5]
var set = Set(array)

.toInt() removed in Swift 2?

I was working on an application that took the text from a text field and translated it into an integer. Previously, my code
textField.text.toInt()
worked. Now Swift declares this as an error and is telling me to do
textField.text!.toInt()
and it says there is no toInt() and to try using Int(). That doesn't work either. What just happened?
In Swift 2.x, the .toInt() function was removed from String. As a replacement, Int now has an initializer that accepts a String:
Int(myString)
In your case, you could use Int(textField.text!) instead of textField.text!.toInt()
Swift 1.x
let myString: String = "256"
let myInt: Int? = myString.toInt()
Swift 2.x, 3.x
let myString: String = "256"
let myInt: Int? = Int(myString)
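Since Int(_:) returns an optional (it is nil for non-numeric input), you will usually unwrap the result, for example:
let myString: String = "256"
if let myInt = Int(myString) {
    print(myInt)           // 256
} else {
    print("not a number")
}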
Swift 2
let myString: NSString = "123"
let myStringToInt: Int = Int(myString.intValue)
Declare your string as an NSString object and use the intValue property (which returns an Int32, hence the Int(...) wrapper).
It's easy enough to create your own extension method to put this back in:
extension String {
func toInt() -> Int? {
return Int(self)
}
}
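With that extension in place, the old call sites keep working, e.g.:
let n = "42".toInt()     // Optional(42)
let bad = "abc".toInt()  // nil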
I had the same issue in a payment-processing app.
Swift 1.0
let expMonth = UInt(expirationDate[0].toInt()!)
let expYear = UInt(expirationDate[1].toInt()!)
Then, in Swift 2.0:
let expMonth = Int(expirationDate[0])
let expYear = Int(expirationDate[1])
That gave me some errors too. This code solved them:
let myString: String = dataEntered.text! // data entered in the text field
var myInt: Int? = Int(myString)          // conversion of the String to an optional Int
myInt = myInt! * 2                       // manipulating the integer: sum, difference, product, division
finalOutput.text = "\(myInt!)"           // updates the finalOutput label with myInt (unwrapped, to avoid printing "Optional(...)")

How to convert Any to Int in Swift

I get an error when declaring i
var users = Array<Dictionary<String,Any>>()
users.append(["Name":"user1","Age":20])
var i:Int = Int(users[0]["Age"])
How to get the int value?
var i = users[0]["Age"] as Int
As GoZoner points out, if you don't know that the downcast will succeed, use:
var i = users[0]["Age"] as? Int
The result will be nil if it fails
Swift 4 answer (for the case where the value is stored as a String):
if let str = users[0]["Age"] as? String, let i = Int(str) {
    // do what you want with i
}
If you are sure the result is an Int then use:
var i = users[0]["Age"] as! Int
but if you are unsure and want a nil value if it is not an Int then use:
var i = users[0]["Age"] as? Int
“Use the optional form of the type cast operator (as?) when you are
not sure if the downcast will succeed. This form of the operator will
always return an optional value, and the value will be nil if the
downcast was not possible. This enables you to check for a successful
downcast.”
Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks.
https://itun.es/us/jEUH0.l
This may have worked previously, but it's not the answer for Swift 3. Just to clarify, I don't have the answer for Swift 3; below is my testing using the above answer, and clearly it doesn't work.
My data comes from an NSDictionary
print("subvalue[multi] = \(subvalue["multi"]!)")
print("as Int = \(subvalue["multi"]! as? Int)")
if let multiString = subvalue["multi"] as? String {
    print("as String = \(multiString)")
    print("as Int = \(Int(multiString)!)")
}
The output generated is:
subvalue[multi] = 1
as Int = nil
Just to spell it out:
a) The original value is of type Any? and the value is: 1
b) Casting to Int results in nil
c) Casting to String results in nil (the print lines never execute)
EDIT
The answer is to use NSNumber:
let num = subvalue["multi"] as? NSNumber
Then we can convert the number to an integer (num is optional, so it has to be unwrapped or chained):
let myint = num?.intValue
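A compact, self-contained version of the same idea (the dictionary literal here is a stand-in for the NSDictionary mentioned above):
import Foundation
let subvalue: NSDictionary = ["multi": 1]
if let num = subvalue["multi"] as? NSNumber {
    let myint = num.intValue
    print(myint) // 1
}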
if let id = json["productID"] as? String {
    self.productID = Int32(id, radix: 10)!
}
This worked for me. json["productID"] is of type Any.
If it can be cast to a String, it is then converted to an Int32.
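A variant that avoids the force unwrap (a small standalone sketch; the dictionary literal stands in for the real JSON):
let json: [String: Any] = ["productID": "42"] // stand-in for the real JSON dictionary
var productID: Int32 = 0
if let id = json["productID"] as? String, let value = Int32(id, radix: 10) {
    productID = value // nothing happens if the string is not a valid number
}
print(productID) // 42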