In Swift 3, I previously converted a Bool to an Int using the following method:
let _ = Int(NSNumber(value: false))
After converting to Swift 4, I'm getting an "'init' is deprecated" warning. How else should this be done?
With Swift 4.2 and Swift 5, you can choose one of the five following solutions to solve your problem.
#1. Using NSNumber's intValue property
import Foundation
let integer = NSNumber(value: false).intValue
print(integer) // prints 0
#2. Using type casting
import Foundation
let integer = NSNumber(value: false) as? Int
print(String(describing: integer)) // prints Optional(0)
#3. Using Int's init(exactly:) initializer
import Foundation
let integer = Int(exactly: NSNumber(value: false))
print(String(describing: integer)) // prints Optional(0)
As an alternative to the previous code, you can use the more concise code below.
import Foundation
let integer = Int(exactly: false)
print(String(describing: integer)) // prints Optional(0)
#4. Using Int's init(truncating:) initializer
import Foundation
let integer = Int(truncating: false)
print(integer) // prints 0
#5. Using control flow
Note that the following samples do not require importing Foundation.
Usage #1 (if statement):
let integer: Int
let bool = false
if bool {
    integer = 1
} else {
    integer = 0
}
print(integer) // prints 0
Usage #2 (ternary operator):
let integer = false ? 1 : 0
print(integer) // prints 0
You can use NSNumber's intValue property:
let x = NSNumber(value: false).intValue
You can also use the init?(exactly number: NSNumber) initializer:
let y = Int(exactly: NSNumber(value: false))
or, as mentioned in the comments by @Hamish, the numeric initializer has been renamed to init(truncating:):
let z = Int(truncating: NSNumber(value: false))
or let the compiler implicitly create an NSNumber from it, as mentioned by @MartinR:
let z = Int(truncating: false)
Another option is to extend protocol BinaryInteger (Swift 4) or Integer (Swift 3) and create your own non-failable initializer that takes a Bool parameter and returns the original type using the ternary operator, as suggested in the comments by @vadian:
extension BinaryInteger {
    init(_ bool: Bool) {
        self = bool ? 1 : 0
    }
}
let a = Int(true) // 1
let b = Int(false) // 0
let c = UInt8(true) // 1
let d = UInt8(false) // 0
Related
This is a super basic question, but I can't seem to find an answer for Swift.
Question:
How do I get the whole integer part and fractional part (to the left and right of the decimal point respectively) of a number in Swift 2 and Swift 3? For example, for the number 1234.56789 —
How do I get the integer part (1234)?
How do I get the fractional part (0.56789)?
You could do a simple floor and truncating remainder:
let value = 1234.56789
let double = floor(value) // 1234.0
let integer = Int(double) // 1234
let decimal = value.truncatingRemainder(dividingBy: 1) // 0.56789
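Note that floor and truncatingRemainder treat negative input differently: floor rounds toward negative infinity, while truncatingRemainder keeps the dividend's sign, so the two parts won't recombine to the original for negative values. A quick check:
import Foundation
let negative = -1234.56789
print(floor(negative))                             // -1235.0
print(negative.truncatingRemainder(dividingBy: 1)) // about -0.56789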
No need for extensions. Swift already has a built-in function for this.
let aNumber = modf(3.12345)
aNumber.0 // 3.0
aNumber.1 // 0.12345
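modf keeps the sign on both parts for negative input, which makes recombining them straightforward; a quick check:
import Foundation
let negativeParts = modf(-3.12345)
negativeParts.0 // -3.0
negativeParts.1 // about -0.12345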
Swift 4.x, 5.x complete solution:
Credit to @Thomas' solution; I'd also like to add a few things which let us use the separated parts as two strings.
This is especially useful when we want to use two different UILabels for the integer and decimal parts of the number.
The extension below also manages the number of digits in the decimal part. I thought it might be useful.
UPDATE: It now works perfectly with negative numbers as well!
public extension Double {
    func integerPart() -> String {
        let result = floor(abs(self)).description.dropLast(2).description
        let plusMinus = self < 0 ? "-" : ""
        return plusMinus + result
    }

    func fractionalPart(_ withDecimalQty: Int = 2) -> String {
        let valDecimal = self.truncatingRemainder(dividingBy: 1)
        let formatted = String(format: "%.\(withDecimalQty)f", valDecimal)
        let dropQuantity = self < 0 ? 3 : 2
        return formatted.dropFirst(dropQuantity).description
    }
}
Convert your number into a String, then split the string on the decimal point.
Try this:
import Foundation
let number: Float = 123.46789
let numberString = String(number)
let numberComponent = numberString.components(separatedBy: ".")
let integerNumber = Int(numberComponent[0])
let fractionalNumber = Int(numberComponent[1])
You could do this:
let x:Double = 1234.5678
let y:Double = Double(Int(x))
let z:Double = x - Double(Int(x))
print("\(x) \(y) \(z)")
Where x is your original value. y is the integer part and z is the fractional part.
Edit
@Thomas' answer is the one you want...
This will work for you in Swift 5 and 4; note that String(format:) rounds rather than truncates, as the output shows:
let number = 3.145443
let integerValue = String(format: "%.0f", number)
let integerValue1 = String(format: "%.1f", number)
let integerValue2 = String(format: "%.2f", number)
print(integerValue)
print(integerValue1)
print(integerValue2)
//Output
3
3.1
3.15
modf() works badly with numbers greater than 1 billion
@Trevor's solution behaves like modf()
@Andres Cedronius' example behaves like modf() too (I tried modifying it)
@Irshad Ahmed's solution is nice
Here are some convenience conversions
If for some reason you need more control, check out https://developer.apple.com/documentation/foundation/numberformatter (a short sketch follows the examples below)
extension Double {
    /// 1.00234 -> 1.0
    var integerPart: Double {
        return Double(Int(self))
    }

    /// 1.0012 -> 0.0012
    var fractionPart: Double {
        let fractionStr = "0.\(String(self).split(separator: ".")[1])"
        return Double(fractionStr)!
    }

    /// 1.0012 -> "0.0012"
    var fractionPartString: String {
        return "0.\(String(self).split(separator: ".")[1])"
    }

    /// 1.0012 -> 12
    var fractionPartInteger: Int {
        let fractionStr = "\(String(self).split(separator: ".")[1])"
        return Int(fractionStr)!
    }
}
print(1000.0.integerPart) // 1000.0
print(1000.0.fractionPart) // 0.0
print(1000.1.integerPart) // 1000.0
print(1000.2.fractionPart) // 0.2
print(1_000_000_000.1.integerPart) // 1000000000.0
print(100_000_000.13233.fractionPart) // 0.13233
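As mentioned above, NumberFormatter is the tool when you need more control over rounding, grouping, or locale. A short sketch (the settings here are just an illustration):
import Foundation
let formatter = NumberFormatter()
formatter.numberStyle = .decimal
formatter.maximumFractionDigits = 2
if let formatted = formatter.string(from: NSNumber(value: 100_000_000.13233)) {
    print(formatted) // "100,000,000.13" in an en_US locale; separators are locale-dependent
}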
import CoreGraphics // for CGFloat below
var specimen0: Double = 1234.56789
var significant0 = Double.IntegerLiteralType(specimen0) // result is an integer
var fractionals0 = specimen0 - Double(significant0)
var specimen1:Float = -1234.56789
var significant1 = Float.IntegerLiteralType(specimen1) // result is an integer
var fractionals1 = specimen1 - Float(significant1)
var specimen2:CGFloat = -1234.56789
var significant2 = CGFloat.IntegerLiteralType(specimen2) // result is an integer
var fractionals2 = specimen2 - CGFloat(significant2)
These are all built in as of Swift 5.3, I am sure even earlier...
I want to convert the index of a letter contained within a string to an integer value. Attempted to read the header files but I cannot find the type for Index, although it appears to conform to protocol ForwardIndexType with methods (e.g. distanceTo).
var letters = "abcdefg"
let index = letters.characters.indexOf("c")!
// ERROR: Cannot invoke initializer for type 'Int' with an argument list of type '(String.CharacterView.Index)'
let intValue = Int(index) // I want the integer value of the index (e.g. 2)
Any help is appreciated.
edit/update:
Xcode 11 • Swift 5.1 or later
extension StringProtocol {
    func distance(of element: Element) -> Int? { firstIndex(of: element)?.distance(in: self) }
    func distance<S: StringProtocol>(of string: S) -> Int? { range(of: string)?.lowerBound.distance(in: self) }
}
extension Collection {
    func distance(to index: Index) -> Int { distance(from: startIndex, to: index) }
}
extension String.Index {
    func distance<S: StringProtocol>(in string: S) -> Int { string.distance(to: self) }
}
Playground testing
let letters = "abcdefg"
let char: Character = "c"
if let distance = letters.distance(of: char) {
    print("character \(char) was found at position #\(distance)") // "character c was found at position #2\n"
} else {
    print("character \(char) was not found")
}
let string = "cde"
if let distance = letters.distance(of: string) {
    print("string \(string) was found at position #\(distance)") // "string cde was found at position #2\n"
} else {
    print("string \(string) was not found")
}
Works for Xcode 13 and Swift 5
let myString = "Hello World"
if let i = myString.firstIndex(of: "o") {
    let index: Int = myString.distance(from: myString.startIndex, to: i)
    print(index) // Prints 4
}
The function func distance(from start: String.Index, to end: String.Index) -> String.IndexDistance returns an IndexDistance, which is just a typealias for Int.
Swift 4
var str = "abcdefg"
let index = str.index(of: "c")?.encodedOffset // Result: 2
Note: If the String contains the same character multiple times, it will return the index of the first occurrence from the left:
var str = "abcdefgc"
let index = str.index(of: "c")?.encodedOffset // Result: 2
encodedOffset has been deprecated since Swift 4.2.
Deprecation message:
encodedOffset has been deprecated as most common usage is incorrect. Use utf16Offset(in:) to achieve the same behavior.
So we can use utf16Offset(in:) like this:
var str = "abcdefgc"
let index = str.index(of: "c")?.utf16Offset(in: str) // Result: 2
When searching for an index like this
⛔️ guard let index = (positions.firstIndex { position <= $0 }) else {
it is treated as Array.Index. You have to give the compiler a clue that you want an integer:
✅ guard let index: Int = (positions.firstIndex { position <= $0 }) else {
Swift 5
You can convert the string to an array of characters and then use advanced(by:) to get an integer index.
let myString = "Hello World"
if let i = Array(myString).firstIndex(of: "o") {
let index: Int = i.advanced(by: 0)
print(index) // Prints 4
}
To perform string operations based on index, you cannot use the traditional numeric index approach, because a String.Index is obtained from the string's indices property and is not an Int. Even though a String is a collection of characters, we still can't read an element by integer index.
This is frustrating.
So, to create a new substring from every even character of a string, check the code below (a more concise stride version follows its output).
let mystr = "abcdefghijklmnopqrstuvwxyz"
let mystrArray = Array(mystr)
let strLength = mystrArray.count
var resultStrArray : [Character] = []
var i = 0
while i < strLength {
if i % 2 == 0 {
resultStrArray.append(mystrArray[i])
}
i += 1
}
let resultString = String(resultStrArray)
print(resultString)
Output : acegikmoqsuwy
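For comparison, a more concise sketch of the same loop using stride; the output is identical:
let mystr = "abcdefghijklmnopqrstuvwxyz"
let chars = Array(mystr)
let resultString = String(stride(from: 0, to: chars.count, by: 2).map { chars[$0] })
print(resultString) // acegikmoqsuwy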
Here is an extension that will let you access the bounds of a substring as Ints instead of String.Index values:
import Foundation
/// This extension is available at
/// https://gist.github.com/zackdotcomputer/9d83f4d48af7127cd0bea427b4d6d61b
extension StringProtocol {
    /// Access the range of the search string as integer indices
    /// in the rendered string.
    /// - NOTE: This is "unsafe" because it may not return what you expect if
    ///   your string contains single symbols formed from multiple scalars.
    /// - Returns: A `CountableRange<Int>` that will align with the Swift String.Index
    ///   from the result of the standard function range(of:).
    func countableRange<SearchType: StringProtocol>(
        of search: SearchType,
        options: String.CompareOptions = [],
        range: Range<String.Index>? = nil,
        locale: Locale? = nil
    ) -> CountableRange<Int>? {
        guard let trueRange = self.range(of: search, options: options, range: range, locale: locale) else {
            return nil
        }
        let intStart = self.distance(from: startIndex, to: trueRange.lowerBound)
        let intEnd = self.distance(from: trueRange.lowerBound, to: trueRange.upperBound) + intStart
        return Range(uncheckedBounds: (lower: intStart, upper: intEnd))
    }
}
Just be aware that this can lead to weirdness, which is why Apple has chosen to make it hard. (Though that's a debatable design decision - hiding a dangerous thing by just making it hard...)
You can read more in the String documentation from Apple, but the tldr is that it stems from the fact that these "indices" are actually implementation-specific. They represent the indices into the string after it has been rendered by the OS, and so can shift from OS-to-OS depending on what version of the Unicode spec is being used. This means that accessing values by index is no longer a constant-time operation, because the UTF spec has to be run over the data to determine the right place in the string. These indices will also not line up with the values generated by NSString, if you bridge to it, or with the indices into the underlying UTF scalars. Caveat developer.
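To make the multi-scalar caveat concrete, here is a quick check with a flag emoji, a single Character built from two Unicode scalars (example string chosen for illustration):
let text = "a🇺🇸b"
print(text.count)                // 3 Characters (the flag is one grapheme cluster)
print(text.unicodeScalars.count) // 4 Unicode scalars
print(text.utf16.count)          // 6 UTF-16 code units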
In case you get an "index is out of bounds" error, you may try this approach. Works in Swift 5:
extension String {
    // Returns the zero-based offset of the first occurrence of char, or -1 if not found
    func countIndex(_ char: Character) -> Int {
        var count = 0
        for c in self {
            if c == char {
                return count
            }
            count += 1
        }
        return -1
    }
}
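Usage (the method returns the zero-based offset of the first match, or -1 when the character is absent):
print("hello".countIndex("l")) // 2
print("hello".countIndex("z")) // -1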
When I try the following:
var somestring = "5"
var somenumber = 2
var newnumber:Int = Int(somestring) + somenumber
I get this error:
binary operator '+' cannot be applied to two Int operands
What am I doing wrong? Shouldn't '+' be valid for adding two Ints?
That's a really weird error message. The actual problem is that you can't simply construct Ints from Strings. At the time (Swift 1.x), the proper way to do that was with the toInt method, like so:
var newnumber: Int = somestring.toInt()! + somenumber
Notice that toInt returns an optional that's unwrapped with !. If you're not sure the string represents an integer, error handling needs to be added as well.
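For reference, in Swift 2 and later the failable Int(String) initializer replaced toInt(); a minimal sketch using optional binding instead of force unwrapping:
let somestring = "5"
let somenumber = 2
if let parsed = Int(somestring) {
    print(parsed + somenumber) // 7
} else {
    print("\"\(somestring)\" is not an integer")
}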
You should consider using the nil coalescing operator "??" to return 0 instead of nil when trying to extract the value from your string:
let someString = "5"
let someNumber = 2
let newNumber = (someString.toInt() ?? 0) + someNumber
println(newNumber) // 7
let anotherString = "a"
let anotherNumber = (anotherString.toInt() ?? 0) + someNumber
println(anotherNumber) // 2
update: Xcode 7.1.1 • Swift 2.1
let someString = "5"
let someNumber = 2
let newNumber = (Int(someString) ?? 0) + someNumber
print(newNumber) // 7
let anotherString = "a"
let anotherNumber = (Int(anotherString) ?? 0) + someNumber // 2
I get an error when declaring i
var users = Array<Dictionary<String,Any>>()
users.append(["Name":"user1","Age":20])
var i:Int = Int(users[0]["Age"])
How to get the int value?
var i = users[0]["Age"] as Int
As GoZoner points out, if you don't know that the downcast will succeed, use:
var i = users[0]["Age"] as? Int
The result will be nil if it fails
Swift 4 answer:
if let str = users[0]["Age"] as? String, let i = Int(str) {
    // do what you want with i
}
If you are sure the result is an Int then use:
var i = users[0]["Age"] as! Int
but if you are unsure and want a nil value if it is not an Int then use:
var i = users[0]["Age"] as? Int
“Use the optional form of the type cast operator (as?) when you are not sure if the downcast will succeed. This form of the operator will always return an optional value, and the value will be nil if the downcast was not possible. This enables you to check for a successful downcast.”
Excerpt From: Apple Inc. “The Swift Programming Language.” iBooks.
https://itun.es/us/jEUH0.l
This may have worked previously, but it's not the answer for Swift 3. To clarify: I don't have the full answer for Swift 3; below is my testing using the above answer, and clearly it doesn't work.
My data comes from an NSDictionary
print("subvalue[multi] = \(subvalue["multi"]!)")
print("as Int = \(subvalue["multi"]! as? Int)")
if let multiString = subvalue["multi"] as? String {
print("as String = \(multiString)")
print("as Int = \(Int(multiString)!)")
}
The output generated is:
subvalue[multi] = 1
as Int = nil
Just to spell it out:
a) The original value is of type Any? and the value is: 1
b) Casting to Int results in nil
c) Casting to String results in nil (the print lines never execute)
EDIT
The answer is to use NSNumber
let num = subvalue["multi"] as? NSNumber
Then we can convert the number to an integer
let myint = num?.intValue // num is an optional, so myint is an Int?
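Since the cast can fail, optional binding keeps everything non-optional; a minimal sketch with hypothetical sample data:
import Foundation
let subvalue: [String: Any] = ["multi": 1] // hypothetical sample data
if let num = subvalue["multi"] as? NSNumber {
    print(num.intValue) // 1
}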
if let id = json["productID"] as? String {
    self.productID = Int32(id, radix: 10)!
}
This worked for me. json["productID"] is of type Any.
If it can be cast to a string, then convert it to an Integer.
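Note that the force unwrap will crash if the string is not numeric; a safer sketch of the same idea, using a hypothetical sample dictionary:
let json: [String: Any] = ["productID": "42"] // hypothetical sample data
if let id = json["productID"] as? String, let productID = Int32(id, radix: 10) {
    print(productID) // 42
} else {
    print("productID missing or not numeric")
}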
I want to convert a Float to an Int in Swift. Basic casting like this does not work because these types are not primitives, unlike floats and ints in Objective-C
var float: Float = 2.2
var integer: Int = float as Float
But this produces the following error message:
'Float' is not convertible to 'Int'
Any idea how to properly convert from Float to Int?
You can convert Float to Int in Swift like this:
var myIntValue: Int = Int(myFloatValue)
println("My value is \(myIntValue)")
You can also achieve this result with @paulm's comment:
var myIntValue = Int(myFloatValue)
Explicit Conversion
Converting to Int will lose any precision (it truncates toward zero, effectively rounding down for positive values). By accessing the math libraries you can perform explicit conversions. For example:
If you wanted to round down and convert to integer:
let f = 10.51
let y = Int(floor(f))
result is 10.
If you wanted to round up and convert to integer:
let f = 10.51
let y = Int(ceil(f))
result is 11.
If you want to explicitly round to the nearest integer
let f = 10.51
let y = Int(round(f))
result is 11.
In the latter case, this might seem pedantic, but it's semantically clearer as there is no implicit conversion...important if you're doing signal processing for example.
There are lots of ways to round number with precision. You should eventually use swift's standard library method rounded() to round float number with desired precision.
To round up use .up rule:
let f: Float = 2.2
let i = Int(f.rounded(.up)) // 3
To round down use .down rule:
let f: Float = 2.2
let i = Int(f.rounded(.down)) // 2
To round to the nearest integer use .toNearestOrEven rule:
let f: Float = 2.2
let i = Int(f.rounded(.toNearestOrEven)) // 2
Be aware of the following example: .toNearestOrEven breaks ties by rounding to the even neighbor (banker's rounding), while roundf rounds halves away from zero:
let f: Float = 2.5
let i = Int(roundf(f)) // 3
let j = Int(f.rounded(.toNearestOrEven)) // 2
Converting is simple:
let float = Float(1.1) // 1.1
let int = Int(float) // 1
But it is not safe:
let float = Float(Int.max) + 1
let int = Int(float)
will end in a nice crash:
fatal error: floating point value can not be converted to Int because it is greater than Int.max
So I've created an extension that handles overflow:
extension Double {
    // If you don't want your code to crash on each overflow, use this function that operates on optionals
    // E.g.: Int(Double(Int.max) + 1) will crash:
    // fatal error: floating point value can not be converted to Int because it is greater than Int.max
    func toInt() -> Int? {
        if self > Double(Int.min) && self < Double(Int.max) {
            return Int(self)
        } else {
            return nil
        }
    }
}
extension Float {
    func toInt() -> Int? {
        if self > Float(Int.min) && self < Float(Int.max) {
            return Int(self)
        } else {
            return nil
        }
    }
}
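A quick check of the extension (values chosen for illustration):
print(3.9.toInt() as Any)                   // Optional(3)
print((Double(Int.max) + 1).toInt() as Any) // nil instead of a crash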
I hope this helps someone.
You can get an integer representation of your float by passing the float into the Integer initializer method.
Example:
Int(myFloat)
Keep in mind that any digits after the decimal point will be lost. Meaning, 3.9 becomes an Int of 3, and 8.99999 becomes an integer of 8.
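Note that the conversion truncates toward zero rather than flooring, which matters for negative values; a quick check:
print(Int(3.9))  // 3
print(Int(-3.9)) // -3 (truncation goes toward zero, not down)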
Like this:
var float:Float = 2.2 // 2.2
var integer:Int = Int(float) // 2 .. will always round down. 3.9 will be 3
var anotherFloat: Float = Float(integer) // 2.0
Use a function style conversion (found in section labeled "Integer and Floating-Point Conversion" from "The Swift Programming Language."[iTunes link])
1> Int(3.4)
$R1: Int = 3
You can type cast like this:
var float:Float = 2.2
var integer:Int = Int(float)
Just use type casting
var floatValue:Float = 5.4
var integerValue:Int = Int(floatValue)
println("IntegerValue = \(integerValue)")
it will show the truncated value, e.g. IntegerValue = 5, meaning everything after the decimal point is lost
var floatValue = 10.23
var intValue = Int(floatValue)
This is enough to convert from float to Int
Suppose you store the float value in x and want the integer value in y:
var y = Int(x)
or
var myIntValue = Int(myFloatValue)
var i = 1 as Int
var cgf = CGFloat(i) // the reverse direction: Int to CGFloat
Most of the solutions presented here would crash for large values and should not be used in production code.
If you don't care about very large values, use this code to clamp the Float to the max/min Int values.
let bigFloat = Float.greatestFiniteMagnitude
let smallFloat = -bigFloat
extension Float {
    func toIntTruncated() -> Int {
        let maxTruncated = min(self, Float(Int.max).nextDown)
        let bothTruncated = max(maxTruncated, Float(Int.min))
        return Int(bothTruncated)
    }
}
// This crashes:
// let bigInt = Int(bigFloat)
// this works for values up to 9223371487098961920
let bigInt = bigFloat.toIntTruncated()
let smallInt = smallFloat.toIntTruncated()
You can make a handy extension using a computed property and use it from anywhere in the project.
extension Float {
    var toInt: Int {
        return Int(self)
    }
}
Calling
private var call: Float = 55.9
print(call.toInt)
Use Int64 instead of Int if you need the full 64-bit range everywhere: Int64 can store large integer values even on 32-bit platforms, where Int is only 32 bits wide.
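A minimal sketch (the value is chosen so it exceeds Int32.max yet is exactly representable as a Float):
let bigValue: Float = 8_589_934_592 // 2^33
let converted = Int64(bigValue)
print(converted) // 8589934592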