I'm trying to separate the decimal and integer parts of a double in swift. I've tried a number of approaches but they all run into the same issue...
let x:Double = 1234.5678
let n1:Double = x % 1.0 // n1 = 0.567800000000034
let n2:Double = x - 1234.0 // same result
var integer: Double = 0
let n3:Double = modf(x, &integer) // same result
Is there a way to get 0.5678 instead of 0.567800000000034 without converting the number to a string?
You can use truncatingRemainder(dividingBy:) with 1 as the divisor.
Returns the remainder of this value divided by the given value using truncating division.
Apple doc
Example:
let myDouble1: Double = 12.25
let myDouble2: Double = 12.5
let myDouble3: Double = 12.75
let remainder1 = myDouble1.truncatingRemainder(dividingBy: 1)
let remainder2 = myDouble2.truncatingRemainder(dividingBy: 1)
let remainder3 = myDouble3.truncatingRemainder(dividingBy: 1)
remainder1 -> 0.25
remainder2 -> 0.5
remainder3 -> 0.75
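If you also need the whole part, a small complementary sketch (not part of the original answer) can truncate toward zero:

let whole1 = myDouble1.rounded(.towardZero) // 12.0
let fraction1 = myDouble1 - whole1          // 0.25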
The same approach as Alessandro Ornano's, implemented as instance properties of the FloatingPoint protocol:
Xcode 11 • Swift 5.1
import Foundation
extension FloatingPoint {
    var whole: Self { modf(self).0 }
    var fraction: Self { modf(self).1 }
}
1.2.whole // 1
1.2.fraction // 0.2
If you need the fractional digits with their precision preserved, you need to use Swift's Decimal type and initialize it with a String:
extension Decimal {
    func rounded(_ roundingMode: NSDecimalNumber.RoundingMode = .plain) -> Decimal {
        var result = Decimal()
        var number = self
        NSDecimalRound(&result, &number, 0, roundingMode)
        return result
    }
    var whole: Decimal { rounded(sign == .minus ? .up : .down) }
    var fraction: Decimal { self - whole }
}
let decimal = Decimal(string: "1234.99999999")! // 1234.99999999
let fractional = decimal.fraction // 0.99999999
let whole = decimal.whole // 1234
let sum = whole + fractional // 1234.99999999
let negativeDecimal = Decimal(string: "-1234.99999999")! // -1234.99999999
let negativefractional = negativeDecimal.fraction // -0.99999999
let negativeWhole = negativeDecimal.whole // -1234
let negativeSum = negativeWhole + negativefractional // -1234.99999999
Swift 2:
You can use:
modf(x).1
or
x % floor(abs(x))
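For example, with the value from the question (Swift 2 syntax; the last digits may still carry binary-precision noise):

let x = 1234.5678
let f1 = modf(x).1          // ≈ 0.5678
let f2 = x % floor(abs(x))  // ≈ 0.5678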
Without converting it to a string, you can round to a given number of decimal places like this:
let x:Double = 1234.5678
let numberOfPlaces:Double = 4.0
let powerOfTen:Double = pow(10.0, numberOfPlaces)
let targetedDecimalPlaces:Double = round((x % 1.0) * powerOfTen) / powerOfTen
Your output would be
0.5678
Swift 5.1
let x:Double = 1234.5678
let decimalPart:Double = x.truncatingRemainder(dividingBy: 1) //0.5678
let integerPart:Double = x.rounded(.towardZero) //1234
Both of these methods return a Double value.
If you want the integer part as an Int, you can just use
Int(x)
Use Float, since it has fewer precision digits than Double:
let x:Double = 1234.5678
let n1:Float = Float(x % 1) // n1 = 0.5678
There's a function in C's math library, and many programming languages, Swift included, give you access to it. It's called modf, and in Swift it works like this:
// modf returns a 2-element tuple,
// with the whole number part in the first element,
// and the fraction part in the second element
let splitPi = modf(3.141592)
splitPi.0 // 3.0
splitPi.1 // 0.141592
You can create an extension like the one below:
extension Double {
    func getWholeNumber() -> Double {
        return modf(self).0
    }
    func getFractionNumber() -> Double {
        return modf(self).1
    }
}
You can get the Integer part like this:
let d: Double = 1.23456e12
let intparttruncated = trunc(d)
let intpartroundlower = Int(d)
The trunc() function truncates the part after the decimal point, and the Int initializer also rounds toward zero, so the two agree for both positive and negative numbers (use floor instead if you want rounding toward negative infinity). If you subtract the truncated part from d, you get the fractional part.
func frac(_ v: Double) -> Double {
    return v - trunc(v)
}
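For example (the results are still subject to binary floating-point error):

frac(1234.5678)  // ≈ 0.5678
frac(-1234.5678) // ≈ -0.5678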
You can get Mantissa and Exponent of a Double value like this:
let d: Double = 1.23456e78
let exponent = trunc(log(d) / log(10.0))
let mantissa = d / pow(10, trunc(log(d) / log(10.0)))
Your result will be 78 for the exponent and 1.23456 for the Mantissa.
Hope this helps you.
It's impossible to create a solution that will work for all Doubles. And if the other answers ever worked, which I also believe is impossible, they don't anymore.
let _5678 = 1234.5678.description.drop { $0 != "." }.description // ".5678"
Double(_5678) // 0.5678
let _567 = 1234.567.description.drop { $0 != "." }.description // ".567"
Double(_567) // 0.5669999999999999
extension Double {
    /// Gets the decimal value from a double.
    var decimal: Double {
        Double("0." + (string.split(separator: ".").last.map(String.init) ?? "0")) ?? 0.0
    }
    var string: String {
        String(self)
    }
}
This appears to solve the Double precision issues.
Usage:
print(34.46979988898988.decimal) // outputs 0.46979988898988
print(34.46.decimal) // outputs 0.46
My function converts a string to a Decimal:
func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
    let formatter = NumberFormatter()
    formatter.maximumFractionDigits = 1
    formatter.generatesDecimalNumbers = true
    return formatter.number(from: strValue) as? NSDecimalNumber ?? 0
}
But it is not working as expected. Sometimes it returns values like
Optional(8.300000000000001)
Optional(8.199999999999999)
instead of 8.3 or 8.2. The string contains a value like "8.3" or "8.2", but the converted decimal is not what I need. Any suggestion where I made a mistake?
Setting generatesDecimalNumbers to true does not work as one might expect. The returned value is an instance of NSDecimalNumber (which can represent the value 8.3 exactly), but apparently the formatter converts the string to a binary floating-point number first (and that cannot represent 8.3 exactly). Therefore the returned decimal value is only approximately correct.
That has also been reported as a bug:
NSDecimalNumbers from NSNumberFormatter are affected by binary approximation error
Note also that (contrary to the documentation) the maximumFractionDigits property has no effect when parsing a string into a number.
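A quick sketch of the mismatch (the exact trailing digits may vary):

import Foundation

let formatter = NumberFormatter()
formatter.generatesDecimalNumbers = true
formatter.maximumFractionDigits = 1

// The formatter goes through a binary Double first, so the resulting
// NSDecimalNumber only approximates 8.3:
print(formatter.number(from: "8.3") as? NSDecimalNumber as Any) // e.g. Optional(8.300000000000001)

// NSDecimalNumber(string:) parses the decimal digits directly:
print(NSDecimalNumber(string: "8.3")) // 8.3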
There is a simple solution: Use
NSDecimalNumber(string: strValue) // or
NSDecimalNumber(string: strValue, locale: Locale.current)
instead, depending on whether the string is localized or not.
Or with the Swift 3 Decimal type:
Decimal(string: strValue) // or
Decimal(string: strValue, locale: .current)
Example:
if let d = Decimal(string: "8.2") {
    print(d) // 8.2
}
I would probably be inclined to just use Decimal(string:locale:), but if you want to use NumberFormatter, you would just manually round it.
func getDecimalFromString(_ string: String) -> NSDecimalNumber {
    let formatter = NumberFormatter()
    formatter.generatesDecimalNumbers = true
    let value = formatter.number(from: string) as? NSDecimalNumber ?? 0
    return value.rounding(accordingToBehavior: RoundingBehavior(scale: 1))
}
Or if you want to return a Decimal:
func getDecimalFromString(_ string: String) -> Decimal {
    let formatter = NumberFormatter()
    formatter.generatesDecimalNumbers = true
    let value = formatter.number(from: string) as? NSDecimalNumber ?? 0
    return value.rounding(accordingToBehavior: RoundingBehavior(scale: 1)) as Decimal
}
Where
class RoundingBehavior: NSDecimalNumberBehaviors {
    private let _scale: Int16

    init(scale: Int16) {
        _scale = scale
    }

    func roundingMode() -> NSDecimalNumber.RoundingMode {
        .plain
    }

    func scale() -> Int16 {
        _scale
    }

    func exceptionDuringOperation(_ operation: Selector, error: NSDecimalNumber.CalculationError, leftOperand: NSDecimalNumber, rightOperand: NSDecimalNumber?) -> NSDecimalNumber? {
        .notANumber
    }
}
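Usage (a hypothetical call; .plain rounding at scale 1 rounds to one fraction digit):

let value = getDecimalFromString("8.34")
print(value) // 8.3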
I want to convert the index of a letter contained within a string to an integer value. Attempted to read the header files but I cannot find the type for Index, although it appears to conform to protocol ForwardIndexType with methods (e.g. distanceTo).
var letters = "abcdefg"
let index = letters.characters.indexOf("c")!
// ERROR: Cannot invoke initializer for type 'Int' with an argument list of type '(String.CharacterView.Index)'
let intValue = Int(index) // I want the integer value of the index (e.g. 2)
Any help is appreciated.
edit/update:
Xcode 11 • Swift 5.1 or later
extension StringProtocol {
    func distance(of element: Element) -> Int? { firstIndex(of: element)?.distance(in: self) }
    func distance<S: StringProtocol>(of string: S) -> Int? { range(of: string)?.lowerBound.distance(in: self) }
}

extension Collection {
    func distance(to index: Index) -> Int { distance(from: startIndex, to: index) }
}

extension String.Index {
    func distance<S: StringProtocol>(in string: S) -> Int { string.distance(to: self) }
}
Playground testing
let letters = "abcdefg"
let char: Character = "c"
if let distance = letters.distance(of: char) {
print("character \(char) was found at position #\(distance)") // "character c was found at position #2\n"
} else {
print("character \(char) was not found")
}
let string = "cde"
if let distance = letters.distance(of: string) {
print("string \(string) was found at position #\(distance)") // "string cde was found at position #2\n"
} else {
print("string \(string) was not found")
}
Works for Xcode 13 and Swift 5
let myString = "Hello World"
if let i = myString.firstIndex(of: "o") {
let index: Int = myString.distance(from: myString.startIndex, to: i)
print(index) // Prints 4
}
The function func distance(from start: String.Index, to end: String.Index) -> String.IndexDistance returns an IndexDistance, which is just a typealias for Int.
Swift 4
var str = "abcdefg"
let index = str.index(of: "c")?.encodedOffset // Result: 2
Note: if the String contains multiple occurrences of the same character, it will just get the nearest one from the left:
var str = "abcdefgc"
let index = str.index(of: "c")?.encodedOffset // Result: 2
encodedOffset has been deprecated since Swift 4.2.
Deprecation message:
encodedOffset has been deprecated as most common usage is incorrect. Use utf16Offset(in:) to achieve the same behavior.
So we can use utf16Offset(in:) like this:
var str = "abcdefgc"
let index = str.index(of: "c")?.utf16Offset(in: str) // Result: 2
When searching for an index like this
⛔️ guard let index = (positions.firstIndex { position <= $0 }) else {
it is treated as Array.Index. You have to give the compiler a clue that you want an integer:
✅ guard let index: Int = (positions.firstIndex { position <= $0 }) else {
Swift 5
You can convert the string to an array of characters and then use advanced(by:) to get an integer index.
let myString = "Hello World"
if let i = Array(myString).firstIndex(of: "o") {
let index: Int = i.advanced(by: 0)
print(index) // Prints 4
}
To perform string operations based on an index, you cannot use the traditional numeric-index approach, because a Swift String.Index is obtained from the indices property and is not an Int. Even though a String is a collection of characters, we still can't read an element by integer index.
This is frustrating.
So, to create a new string from every even-indexed character of a string, check the code below.
let mystr = "abcdefghijklmnopqrstuvwxyz"
let mystrArray = Array(mystr)
let strLength = mystrArray.count
var resultStrArray: [Character] = []
var i = 0
while i < strLength {
    if i % 2 == 0 {
        resultStrArray.append(mystrArray[i])
    }
    i += 1
}
let resultString = String(resultStrArray)
print(resultString)
Output : acegikmoqsuwy
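For comparison, a more compact sketch using enumerated(), which pairs each character with its integer offset (same output, treating "even" as 0-based offsets):

let compact = String(mystr.enumerated().compactMap { $0.offset % 2 == 0 ? $0.element : nil })
print(compact) // acegikmoqsuwy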
Here is an extension that will let you access the bounds of a substring as Ints instead of String.Index values:
import Foundation
/// This extension is available at
/// https://gist.github.com/zackdotcomputer/9d83f4d48af7127cd0bea427b4d6d61b
extension StringProtocol {
    /// Access the range of the search string as integer indices
    /// in the rendered string.
    /// - NOTE: This is "unsafe" because it may not return what you expect if
    ///   your string contains single symbols formed from multiple scalars.
    /// - Returns: A `CountableRange<Int>` that will align with the Swift String.Index
    ///   from the result of the standard function range(of:).
    func countableRange<SearchType: StringProtocol>(
        of search: SearchType,
        options: String.CompareOptions = [],
        range: Range<String.Index>? = nil,
        locale: Locale? = nil
    ) -> CountableRange<Int>? {
        guard let trueRange = self.range(of: search, options: options, range: range, locale: locale) else {
            return nil
        }
        let intStart = self.distance(from: startIndex, to: trueRange.lowerBound)
        let intEnd = self.distance(from: trueRange.lowerBound, to: trueRange.upperBound) + intStart
        return Range(uncheckedBounds: (lower: intStart, upper: intEnd))
    }
}
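A usage sketch (the integer bounds assume a simple ASCII string):

let text = "Hello World"
if let range = text.countableRange(of: "World") {
    print(range) // 6..<11
}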
Just be aware that this can lead to weirdness, which is why Apple has chosen to make it hard. (Though that's a debatable design decision - hiding a dangerous thing by just making it hard...)
You can read more in the String documentation from Apple, but the tldr is that it stems from the fact that these "indices" are actually implementation-specific. They represent the indices into the string after it has been rendered by the OS, and so can shift from OS-to-OS depending on what version of the Unicode spec is being used. This means that accessing values by index is no longer a constant-time operation, because the UTF spec has to be run over the data to determine the right place in the string. These indices will also not line up with the values generated by NSString, if you bridge to it, or with the indices into the underlying UTF scalars. Caveat developer.
In case you get an "index out of bounds" error, you may try this approach. Works in Swift 5:
extension String {
    func countIndex(_ char: Character) -> Int {
        var count = 0
        for c in self {
            if c == char {
                return count
            }
            count += 1
        }
        return -1
    }
}
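A hypothetical usage (returns -1 when the character is not found):

print("hello".countIndex("l")) // 2 (offset of the first "l")
print("hello".countIndex("z")) // -1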
Trying to run a simple calculation via a function in my class. I simply want to add bill1 + bill2 and print the total amount spent on bills (bill1 + bill2 = total), and then print that total.
The current error states: "Code after 'return' will never be executed." Is my print statement in the wrong location, or did I declare my variables incorrectly? Should I be using vars instead of lets?
What do you recommend for my function in order to calculate and print the result?
class BillsCalculator {
    let nameOfBill1: String = "Medical"
    let nameOfBill1: String = "Hulu"
    let monthlyBillAmount1: Double = 34.25
    let monthlyBillAmount2: Double = 7.99
    let calculateTotalsPerMonth: Double = 0.0

    //calculateTotalPerMonth ( = monthlyBillAmount_1 + monthlyBillAmount_2 + 3)
    func calculateTotalsPerMonth(monthlyBillAmount: Double, monthlyBillAmount2: Double) -> Double {
        //totalBillsPerMonth = add(monthlyBillAmount1 + monthlyBillAmount2)
        return totalBillsPerMonth(monthlyBillAmount1 + monthlyBillAmount2)
        println("You spend \(totalBillsPerMonth)") // Error: "Code after 'return' will never be executed."
    }
}
First: "Code after 'return' will never be executed."
Yes it will not, after you call return you exit the function and return to the function that call it, you probably have an warning in XCode warning you about telling you that
Second: "Should I be using vars instead of lets"
If the value changes you MUST use var, if it does not you SHOULD use let.
Some problems I can see in your code:
class BillsCalculator {
    // Some people prefix class variables with _
    // (e.g. _nameOfBill instead of nameOfBill1).
    // Using nameOfBill1 is not wrong, just a style choice.
    // If nameOfBill1 changes, use var.
    let nameOfBill1: String = "Medical"
    // Why is this declared twice with the same name? Rename one to nameOfBill2.
    let nameOfBill2: String = "Hulu"
    // These values look like they change, so they should be var.
    var monthlyBillAmount1: Double = 34.25
    var monthlyBillAmount2: Double = 7.99
    // Renamed so the property doesn't clash with the method name.
    var totalBillsPerMonth: Double = 0.0

    func calculateTotalsPerMonth() -> Double {
        totalBillsPerMonth = monthlyBillAmount1 + monthlyBillAmount2
        // Print before return:
        println("You spend \(totalBillsPerMonth)")
        return totalBillsPerMonth
    }
}
Here you should either print the value of your total bill or return that value. Since you just want to print the total bill amount, I would recommend just printing and not returning anything. You can refer to the code below.
class BillsCalculator {
    let nameOfBill1: String = "Medical"
    let nameOfBill2: String = "Hulu"
    let monthlyBillAmount1: Double = 34.25
    let monthlyBillAmount2: Double = 7.99

    func calculateTotalsPerMonth() {
        let totalBillsPerMonth = monthlyBillAmount1 + monthlyBillAmount2
        println("You spend: \(totalBillsPerMonth)")
    }
}
One tiny error in your code
let nameOfBill1: String = "Medical"
let nameOfBill1: String = "Hulu"
These two variables have the same name, perhaps one should be:
let nameOfBill2: String = "Hulu"
And yes, return ends execution of the function, so any code after return will never be executed. If you only want to get the total of two bills, you can simply do this:
func calculateTotalsPerMonth(monthlyBillAmount1: Double, monthlyBillAmount2: Double) -> Double {
    //println("You spend \(monthlyBillAmount1 + monthlyBillAmount2)")
    return monthlyBillAmount1 + monthlyBillAmount2
}
and call this function with your bill variables, like:
let bill1 = 34.25
let bill2 = 7.99
let totalBill = calculateTotalsPerMonth(bill1, bill2)
println("You spent \(totalBill)")
Swift is a very smart, type-safe language. You can omit the type annotation if you want; it's more of a personal programming-style choice.
let bill1: Double = 34.25
let bill1 = 34.25
Both will be of type Double.
As others have said, you need to put your println statement before return: since return ends the execution of the method, the println will never be run.
However, I would suggest a few changes to your current approach:
// A bill is an object - why not encapsulate it in a struct.
struct Bill {
    let name: String
    let amount: Double
}

// Using structs is generally preferred, unless you need inheritance and/or
// references to your BillsCalculator objects.
struct BillsCalculator {
    let bill1: Bill
    let bill2: Bill

    // Using a read-only computed property means you don't need to set
    // the total to have an initial value of zero.
    var totalBilled: Double {
        return bill1.amount + bill2.amount
    }
}
}
// Since you're probably going to want to reuse BillsCalculator,
// don't have each bill set already. Instead, use BillsCalculator's
// initialiser and pass in bills.
let bill1 = Bill(name: "Medical", amount: 34.25)
let bill2 = Bill(name: "Hulu", amount: 7.99)
let cal = BillsCalculator(bill1: bill1, bill2: bill2)

print("You've spent \(cal.totalBilled) this month")
Before I updated to Xcode 6, I had no problems casting a double to a string, but now it gives me an error:
var a: Double = 1.5
var b: String = String(a)
It gives me the error message "double is not convertible to string". Is there any other way to do it?
It is not casting, it is creating a string from a value with a format.
let a: Double = 1.5
let b: String = String(format: "%f", a)
print("b: \(b)") // b: 1.500000
With a different format:
let c: String = String(format: "%.1f", a)
print("c: \(c)") // c: 1.5
If no specific format is needed, you can simply use the description property:
let double = 1.5
let string = double.description
update Xcode 7.1 • Swift 2.1:
Now Double is also convertible to String so you can simply use it as you wish:
let double = 1.5
let doubleString = String(double) // "1.5"
In Swift 3 or later, we can extend LosslessStringConvertible and make it generic.
Xcode 11.3 • Swift 5.1 or later
extension LosslessStringConvertible {
    var string: String { .init(self) }
}
let double = 1.5
let string = double.string // "1.5"
For a fixed number of fraction digits, we can extend the FloatingPoint protocol:
extension FloatingPoint where Self: CVarArg {
    func fixedFraction(digits: Int) -> String {
        .init(format: "%.*f", digits, self)
    }
}
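For example (with a Double, which conforms to CVarArg):

1.5.fixedFraction(digits: 3) // "1.500"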
If you need more control over your number format (minimum and maximum fraction digits and rounding mode) you can use NumberFormatter:
extension Formatter {
    static let number = NumberFormatter()
}

extension FloatingPoint {
    func fractionDigits(min: Int = 2, max: Int = 2, roundingMode: NumberFormatter.RoundingMode = .halfEven) -> String {
        Formatter.number.minimumFractionDigits = min
        Formatter.number.maximumFractionDigits = max
        Formatter.number.roundingMode = roundingMode
        Formatter.number.numberStyle = .decimal
        return Formatter.number.string(for: self) ?? ""
    }
}
2.12345.fractionDigits() // "2.12"
2.12345.fractionDigits(min: 3, max: 3, roundingMode: .up) // "2.124"
In addition to Zaph's answer, you can create an extension on Double:
extension Double {
    func toString() -> String {
        return String(format: "%.1f", self)
    }
}
Usage:
var a:Double = 1.5
println("output: \(a.toString())") // output: 1.5
Swift 3+: try these lines of code:
let num: Double = 1.5
let str = String(format: "%.2f", num)
To make almost anything a string in Swift (except maybe enum values), simply do what you do in the println() method,
for example:
var stringOfDBL = "\(myDouble)"
There are many answers here that suggest a variety of techniques. But when presenting numbers in the UI, you invariably want to use a NumberFormatter so that the results are properly formatted, rounded, and localized:
let value = 10000.5
let formatter = NumberFormatter()
formatter.numberStyle = .decimal
guard let string = formatter.string(for: value) else { return }
print(string) // 10,000.5
If you want a fixed number of decimal places, e.g. for currency values:
let value = 10000.5
let formatter = NumberFormatter()
formatter.numberStyle = .decimal
formatter.maximumFractionDigits = 2
formatter.minimumFractionDigits = 2
guard let string = formatter.string(for: value) else { return }
print(string) // 10,000.50
But the beauty of this approach is that it will be properly localized, resulting in 10,000.50 in the US but 10.000,50 in Germany. Different locales have different preferred formats for numbers, and we should let NumberFormatter use the format preferred by the end user when presenting numeric values within the UI.
Needless to say, while NumberFormatter is essential when preparing string representations within the UI, it should not be used if writing numeric values as strings for persistent storage, interface with web services, etc.
Swift 4:
Use the following code:
let number = 2.4
let string = String(format: "%.2f", number)
This function will let you specify the number of decimal places to show:
func doubleToString(number: Double, numberOfDecimalPlaces: Int) -> String {
    return String(format: "%." + numberOfDecimalPlaces.description + "f", number)
}
Usage:
let numberString = doubleToString(number: x, numberOfDecimalPlaces: 2)
In Swift 3:
var a: Double = 1.5
var b: String = String(a)
In Swift 3 it is as simple as shown below:
let stringDouble = String(describing: double)
I would prefer the NSNumber and NumberFormatter approach (where needed); you can also use an extension to avoid bloating code:
extension Double {
    var toString: String {
        return NSNumber(value: self).stringValue
    }
}
You may also need the reverse approach:
extension String {
    var toDouble: Double {
        return Double(self) ?? .nan
    }
}
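Usage of both (the .nan fallback applies when parsing fails):

let d = 1.5
print(d.toString)      // "1.5"
print("2.25".toDouble) // 2.25
print("abc".toDouble)  // nan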
var b = String(stringInterpolationSegment: a)
This works for me; you may give it a try.
In Swift 4, if you'd like to format a Double for use in the UI as a text label String, you can add this at the end of your file:
extension Double {
    func roundToInt() -> Int {
        return Int(Darwin.round(self))
    }
}
And use it like this if you'd like to have it in a text label:
currentTemp.text = "\(weatherData.tempCelsius.roundToInt())"
Or print it as an Int:
print(weatherData.tempCelsius.roundToInt())
Swift 5:
Use the following code:
extension Double {
    func getStringValue(withFloatingPoints points: Int = 0) -> String {
        let valDouble = modf(self)
        let fractionalVal = valDouble.1
        if fractionalVal > 0 {
            return String(format: "%.*f", points, self)
        }
        return String(format: "%.0f", self)
    }
}
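For example (whole numbers drop the fraction entirely, because the fractional part is 0):

12.5.getStringValue(withFloatingPoints: 2) // "12.50"
12.0.getStringValue(withFloatingPoints: 2) // "12"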
You shouldn't really ever cast a double to a string. The most common reason for converting a float to a string is to present it to a user, but floats are not real numbers and can only approximate many values, similar to how ⅓ cannot be represented as a decimal number with a finite number of decimal places. Instead, keep your values as floating point throughout, and when you want to present them to the user, use something like NumberFormatter to convert them for you. This conversion for user presentation is what something like your view model should do.
Use this:
Text(String(format: "%.2f", doubleValue))