Basic Maths in Swift

I'm very new to Xcode and am trying to make a simple app that calculates gross profit.
I am trying to use the following code, but it returns the value '0'.
Any ideas why?
// Playground - noun: a place where people can play
import UIKit
var costPrice = 10
var salePrice = 100
var grossProfit = ((salePrice - costPrice) / salePrice) * 100
println(grossProfit)

This is all explained in the first few pages of the iBook "Introduction to Swift", which is free and published by Apple.
Swift is type safe and will infer a type from context.
The line var costPrice = 10 infers that the variable costPrice is an Int.
You then can't implicitly combine Ints with other numeric types (Doubles, for instance).
If you try this:
let costPrice = 10.0
let salePrice = 100.0
let grossProfit = ((salePrice - costPrice) / salePrice) * 100.0
you will find it works.

10 and 100 are integers, so costPrice and salePrice are inferred as Int. Integer division truncates, as you're seeing: (100 - 10) / 100 is 90 / 100, which is 0 in integer arithmetic. You wanted to use 10.0 and 100.0 here.
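If you'd rather keep the prices as Int, an explicit conversion also works; a minimal sketch of that alternative:
var costPrice = 10
var salePrice = 100
// Convert to Double before dividing so the division isn't truncated
var grossProfit = (Double(salePrice - costPrice) / Double(salePrice)) * 100
print(grossProfit) // 90.0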

Related

Error when writing an algorithm in Swift

I've been working on a nutrition app for iOS lately. I'm trying to write this algorithm
weight (lb) / [height (in)]² × 703
in Xcode with Swift, but I'm a beginner with Swift. I wrote this code
let calculate = ((textforweight.text) / ((textforheight.text)*(textforheight.text))*703)
and it gives me the error "Type of expression is ambiguous without more context". Can you help me with this? It might be easy, but I just don't know how to do it.
UITextField's .text property returns an optional String, so you need to convert that value to a number first. Parsing to Double (rather than Int) also avoids integer division truncating the result:
let weight = Double(textforweight.text ?? "") ?? .zero
let height = Double(textforheight.text ?? "") ?? .zero
Now you can calculate your result. Note the parentheses around height * height, so the division matches the formula:
let result = weight / (height * height) * 703
UITextField .text Property Documentation
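A slightly more defensive variant, as a sketch (the computeBMI name and the nil return for bad input are my additions, not from the answer above):
func computeBMI(weightText: String?, heightText: String?) -> Double? {
    // Bail out if either field is empty, non-numeric, or would divide by zero
    guard let weight = Double(weightText ?? ""),
          let height = Double(heightText ?? ""),
          height > 0 else { return nil }
    // BMI formula: weight (lb) / [height (in)]² × 703
    return weight / (height * height) * 703
}
let bmi = computeBMI(weightText: textforweight.text, heightText: textforheight.text)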

Why does subtracting NSDecimalNumbers not work - Swift 3

Why am I getting 0 when subtracting 5.0 from 650.50 using the subtracting() method?
In the following code, adding, multiplying, and dividing work fine, but subtracting doesn't. What am I doing wrong?
See code in IBM's sandbox:
http://swift.sandbox.bluemix.net/#/repl/59b1387696a0602d6cb19201
import Foundation
let num1:NSDecimalNumber = 650.50
let num2:NSDecimalNumber = 5.0
let result = num1.adding(num2)
let result2 = num1.subtracting(num2)
let result3 = num1.multiplying(by: num2)
let result4 = num1.dividing(by: num2)
print("Addition: \(result)") // Addition: 655.5
// Why am I getting 0 here and not 645.5?
print("Subtraction: \(result2)") //Subtraction: 0
print("Multiplication: \(result3)") //Multiplication: 3252.5
print("Division: \(result4)") //Division: 130.1
Apple's docs:
https://developer.apple.com/documentation/foundation/nsdecimalnumber
This may be due to a quirk of the IBM sandbox related to NSDecimalNumber (indeed, many parts of Foundation are still not entirely available on Linux).
Anyway, whatever the bug is, a solution is to use the Swift counterpart of NSDecimalNumber, which is Decimal.
Even though this is supposed to be only a wrapper around NSDecimalNumber, it gives the correct result, even on the IBM platform.
Note that with this wrapper you don't use the NSDecimalNumber methods; you use Swift operators such as + or *.
import Foundation
let num1: Decimal = 650.50
let num2: Decimal = 5.0
let result = num1 + num2
let result2 = num1 - num2
let result3 = num1 * num2
let result4 = num1 / num2
print("Addition: \(result)")
print("Subtraction: \(result2)")
print("Multiplication: \(result3)")
print("Division: \(result4)")
Gives:
Addition: 655.5
Subtraction: 645.5
Multiplication: 3252.5
Division: 130.1
Your code isn't wrong and works correctly in Xcode on macOS. However, IBM's Swift Sandbox runs on Linux, and the Linux implementation of Foundation has gaps: in the status page of the swift-corelibs-foundation repo, NSDecimalNumber is marked as "Unimplemented", so it may misbehave there. Use the Swift overlay types (such as Decimal) instead.
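One caveat worth adding (my note, not part of the answers above): a float literal such as 650.50 reaches Decimal via Double, so for amounts that must be represented exactly it is safer to initialize from a string or from integer components:
import Foundation
// Initializing from a string avoids the intermediate Double conversion
let exact = Decimal(string: "650.50")! // 650.5
// Or build 65050 × 10^-2 from integer parts
let built = Decimal(sign: .plus, exponent: -2, significand: 65050)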

Round currency to the closest five cents

I'd like to round my values to the nearest 5 cents, for example:
5.31 -> 5.30
5.35 -> 5.35
5.33 -> 5.35
5.38 -> 5.40
Currently I'm doing it by getting the decimal values using:
let numbers = 5.33
let decimal = (numbers - rint(numbers)) * 100
let rounded = rint(numbers) + (5 * round(decimal / 5)) / 100
// This results in 5.35
I was wondering if there's a better method with fewer steps, because sometimes numbers - rint(numbers) gives me a weird result like:
let numbers = 12.12
let decimal = (numbers - rint(numbers)) * 100
// This results in 11.9999999999999
Turns out it's really simple:
let x: Float = 1.03 //or whatever value, you can also use the Double type
let y = round(x * 20) / 20
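Checking it against the sample values from the question:
round(5.31 * 20) / 20 // 5.3
round(5.33 * 20) / 20 // 5.35
round(5.38 * 20) / 20 // 5.4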
It's really better to stay away from floating-point for this kind of thing, but you can probably improve the accuracy a little with this:
import Foundation
func roundToFive(n: Double) -> Double {
    let f = floor(n)
    return f + round((n - f) * 20) / 20
}
roundToFive(12.12) // 12.1
I would also use the round function and an NSNumberFormatter, but with a slightly different algorithm:
I was thinking about using %, but I changed it to /.
let formatter = NSNumberFormatter()
formatter.minimumFractionDigits = 2
formatter.maximumFractionDigits = 2
//5.30
formatter.stringFromNumber(round(5.31/0.05)*0.05)
//5.35
formatter.stringFromNumber(round(5.35/0.05)*0.05)
//5.35
formatter.stringFromNumber(round(5.33/0.05)*0.05)
//5.40
formatter.stringFromNumber(round(5.38/0.05)*0.05)
//12.15
formatter.stringFromNumber(round(12.13/0.05)*0.05)
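For reference, the same approach in current Swift (NumberFormatter and string(from:) are today's names for these APIs):
import Foundation
let formatter = NumberFormatter()
formatter.minimumFractionDigits = 2
formatter.maximumFractionDigits = 2
formatter.string(from: NSNumber(value: round(5.33 / 0.05) * 0.05)) // "5.35"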
Depending on how you are storing your currency data, I would recommend using a dictionary or an array to look up the original cents value and return a pre-computed result. There's no reason to do the calculations at all, since you know that 0 <= cents < 100.
If your currency is a string input, just chop off the last couple of digits and do a dictionary lookup.
round_cents = [ ... "12":"10", "13":"15", ... ]
If your currency is a floating-point value, well, you have already discovered the joys of trying to do that. You should change it.
If your currency is a data type, or a fixed-point integer, just get the cents part out and do an array lookup.
...
round_cents[12] = 10
round_cents[13] = 15
...
In either case, you would then do:
new_cents = round_cents[old_cents]
and be done with it.
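If you'd rather compute than maintain a table (a different trade-off from the lookup recommended above, and my addition), the same rounding is exact in plain integer arithmetic, assuming a non-negative number of cents:
func roundedToNearestFiveCents(_ cents: Int) -> Int {
    // Nearest multiple of 5, using integer math only (assumes cents >= 0)
    let remainder = cents % 5
    return remainder < 3 ? cents - remainder : cents + (5 - remainder)
}
roundedToNearestFiveCents(531) // 530
roundedToNearestFiveCents(533) // 535
roundedToNearestFiveCents(538) // 540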

binary operator / cannot be applied to operands of type Int and Double [duplicate]

Hello, I'm brand new to Swift and to programming in general. Going through an exercise, the code given is exactly:
//: Playground - noun: a place where people can play
import UIKit
let height = 12
let width = 10
let area = height * width
let areaInMeters = area / 10.762
But I get the error, "binary operator / cannot be applied to operands of type Int and Double".
After some digging around, I found you can't operate on both an Integer and a Double. So I changed the last line to:
let areaInMeters = (Double)area / 10.762
Then I get the error "Consecutive statements on a line must be separated by a ';'", and it wants me to put the ; after area. None of this is making any sense to me.
Using El Capitan beta and Xcode 7 beta.
height and width will both be inferred to be of type Int. Therefore area is also of type Int, whilst 10.762 is a Double.
In Swift, safety is paramount, so you'll need both operands to be of the same type.
The solution (as Eric D. suggested) is to convert area to a Double. Note that Swift has no C-style casts: (Double)area parses as the expression (Double) followed by a separate statement area, which is why you got the "consecutive statements" error. Use the Double(area) initializer instead:
let areaInMeters = Double(area) / 10.762
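Putting it together, the corrected playground reads:
import UIKit
let height = 12
let width = 10
let area = height * width // Int
let areaInMeters = Double(area) / 10.762 // convert before dividing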
Try instead adding a decimal point and a zero to the end of your height and width.
Like so:
let height = 12.0
let width = 10.0
And you won't have to deal with an Integer at all.
Hope this helps. Happy Coding!

How to find max value for Double and Float in Swift

Currently learning Swift: there are ways to find the max and min values for the different kinds of Integer, like Int.max and Int.min.
Is there a way to find the max value for Double and Float? Moreover, which document should I refer to for this kind of question? I am currently reading Apple's The Swift Programming Language.
As of Swift 3+, you should use:
CGFloat.greatestFiniteMagnitude
Double.greatestFiniteMagnitude
Float.greatestFiniteMagnitude
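For the "min" side (my note, not part of the answer above), the counterparts are the negated magnitude and the least positive values:
let maxDouble = Double.greatestFiniteMagnitude // largest finite value, about 1.79e308
let lowestDouble = -Double.greatestFiniteMagnitude // most negative finite value
let tiniestDouble = Double.leastNonzeroMagnitude // smallest positive value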
While there’s no Double.max, it is defined in the C float.h header, which you can access in Swift via import Darwin.
import Darwin
let fmax = FLT_MAX
let dmax = DBL_MAX
These are roughly 3.4 * 10^38 and 1.79 * 10^308 respectively.
But bear in mind it’s not so simple with floating point numbers (it’s never simple with floating point numbers). When holding numbers this large, you lose precision in a similar way to losing precision with very small numbers, so:
let d = DBL_MAX
let e = d - 1.0
let diff = d - e
diff == 0.0 // true
let maxPlusOne = DBL_MAX + 1
maxPlusOne == d // true
let inf = DBL_MAX * 2
// perhaps infinity is the “maximum”
inf == Double.infinity // true
So before you get into some calculations that might possibly brush up against these limits, you should probably read up on floating point. Here and here are probably a good start.
AV's answer is fine, but I find those macros hard to remember and a bit non-obvious, so eventually I made Double.MIN and friends work:
extension Double {
    static var MIN = -DBL_MAX
    static var MAX_NEG = -DBL_MIN
    static var MIN_POS = DBL_MIN
    static var MAX = DBL_MAX
}
Don't use lowercase min and max; those names are taken in Swift 3.
Just write:
let mxFloat = MAXFLOAT
and you will get the maximum value of a Float in Swift.
Also CGFloat.infinity, Double.infinity or just .infinity can be useful in such situations.
Works with Swift 5:
public extension Double {
    /// Max double value.
    static var max: Double {
        return Double(greatestFiniteMagnitude)
    }

    /// Min double value.
    static var min: Double {
        return Double(-greatestFiniteMagnitude)
    }
}