I am certain I am missing something very, very obvious, but can anyone tell me why I am having trouble multiplying two Integers? The following code:
let twenty: Integer = 20
let ten: Integer = 10
let result: Integer = twenty * ten
presents the error Could not find an overload for '*' that accepts the supplied arguments.
Other questions on SO with the same error are caused by trying to multiply different types together, but surely these are both Integer types?
(PS: The actual code I am trying to run is var value = self.value * 10 but I have expanded it to the sample while debugging to make absolutely sure that the correct types are being used)
Use Int instead. Integer is a protocol.
Integer is a protocol not a type. Use Int instead.
As already stated, Integer is a protocol, not a type.
In your situation, you don't even need to declare the type explicitly, because it is inferred from the integer literals.
This would be enough:
import Foundation

let twenty = 20
let ten = 10
let result = twenty * ten
NSLog("%ld", result) // Int is 64-bit on modern Apple platforms, so use %ld (or simply print(result))
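As a side note, protocols like this one are meant to be used as generic constraints rather than as the type of a variable. A minimal sketch using BinaryInteger, the modern (Swift 4+) name for the integer protocol:

func square<T: BinaryInteger>(_ x: T) -> T {
    return x * x // * is available because BinaryInteger refines Numeric
}

square(20)        // 400, T inferred as Int
square(UInt8(10)) // 100, the same code works for any integer type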
I am following a course called "Code With Chris - 14 Day Beginner Challenge (SwiftUI)" (yes, I am a beginner), and after each lesson there is a challenge. I have almost completed this challenge, but I couldn't figure out why it wouldn't work, so I checked the Dropbox of the completed challenge and found I had everything pretty much the same. I have since found a solution similar to the source, but I still don't understand why my first version (first picture) won't work. I copied everything identically from the source code and it won't work. Could the fault be in the creators' source code rather than mine?
My expected result is for the "Int" to work just like the "Double" did: the number of people is 5, so I don't see why it wouldn't.
My actual result is an error.
My goal is to complete this challenge:
We’re going to be trying out some math operations in a Swift Playground.
Open Xcode and create a new playground
(File Menu->New->Playground).
From the list of Playground templates, just select “Blank”
Challenge 1
Declare a struct called TaxCalculator
Declare a property inside called tax and set it to a decimal value representing the amount of sales tax where you live
Declare a method inside called totalWithTax that accepts a Double as an input parameter and returns a Double value.
Inside that method, write the code to return a Double value representing the input number with tax included
Challenge 2
Declare a struct called BillSplitter
Declare a method inside called splitBy that:
has an input parameter of type Double representing a subtotal
has an input parameter of type Int representing the number of people
returns a Double value
Inside that method, use an instance of TaxCalculator (from challenge 1 above) to calculate the total with tax and then split the bill by the number of people passed into the method.
Return the amount that each person has to pay.
Challenge 3
Create an instance of BillSplitter
Use the instance to print out the amount that each person pays (Assuming 5 people with a bill of $120)
The code of the course I am using:
https://www.dropbox.com/sh/7aopencivoiegz4/AADbxSj83wt6mPNNgYcARFAsa/Lesson%2009?dl=0&file_subpath=%2FL9+Challenge+Solution.playground%2FContents.swift&preview=L9+Challenge+Solution.zip&subfolder_nav_tracking=1
an image of the code with an error
an image of the code without an error
// https://learn.codewithchris.com/courses/take/start/texts/18867185-lesson-9-challenge

// Challenge 1
struct TaxCalculator {
    var tax = 0.15
    func totalWithTax(_ subtotal: Double) -> Double {
        return subtotal * (1 + tax)
    }
}

// Challenge 2
struct BillSplitter {
    func splitBy(subtotal: Double,
                 numPeople: Int // here is the problem
    ) -> Double {
        let taxCalc = TaxCalculator()
        let totalWithTax = taxCalc.totalWithTax(subtotal)
        return totalWithTax / numPeople // error: binary operator '/' cannot be applied to operands of type 'Double' and 'Int'
    }
}

let Split = BillSplitter()
print(Split.splitBy(subtotal: 120, numPeople: 5))
totalWithTax is a Double. numPeople is an Int.
You need to convert numPeople to a Double too.
return totalWithTax / Double(numPeople)
Operators like / don't work with mismatching types.
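With that one-line change the method compiles. Reusing the TaxCalculator from the question, a $120 bill at a 15% tax rate split five ways comes out as follows:

struct BillSplitter {
    func splitBy(subtotal: Double, numPeople: Int) -> Double {
        let taxCalc = TaxCalculator()
        let totalWithTax = taxCalc.totalWithTax(subtotal)
        return totalWithTax / Double(numPeople) // convert before dividing
    }
}

print(BillSplitter().splitBy(subtotal: 120, numPeople: 5)) // 27.6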
Swift is a bit of a pain with scalar types. Most C family languages will quietly "promote" scalar types to other types as long as there is no loss of data.
byte->int->long int->float->double all happen silently.
In C, this code just works:
int a = 2;
double b = 2.5;
double c = a * b;
The value a gets promoted to a double, and the result is that c contains the double value 5.0.
Not so with Swift.
In Swift, you have to explicitly cast a to a double. It won't let you multiply an Int and a Double unless you explicitly cast the Int to a Double, as aheze said in their answer:
return totalWithTax / Double(numPeople)
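Here is the same computation as the C snippet above, written the way Swift requires:

let a = 2
let b = 2.5
// let c = a * b       // error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'
let c = Double(a) * b  // 5.0, with the conversion spelled out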
Basically I would like to know if it is possible to get the exponent value of a number, e.g.:
number = 2.6e3
I want to get the value 3 of the exponent. I have been searching for quite a while now and have not found the answer. I am new to programming, so I may not know exactly what to look for (which methods, etc.).
Any help is much appreciated! Thanks!
Assuming I am interpreting your question correctly, this is what you want to do:
B = A^X, where A and B are known values; solve for X.
For example, 1000 = 10^X (in this case, X = 3).
The code below will work for any base. It requires Foundation (or UIKit, which imports it) for the log function; the arguments value and base are B and A respectively. Try the code out in an Xcode playground!
import Foundation

func getExponentForValueAndBase(_ value: Double, base: Double) -> Double {
    return log(value) / log(base)
}

getExponentForValueAndBase(1000, base: 10) // = 3.0
Assuming this is your question: given a number as an integer, find the integer value of its log base 10.
import Foundation

func log(_ number: Int) -> Int {
    // log10 works on Double, and floor returns a Double, so convert on the way in and out
    return Int(floor(log10(Double(number))))
}
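Applied to the number from the question, either answer yields the exponent (a small check, which works because the mantissa 2.6 lies in 1..<10):

import Foundation

let number = 2.6e3    // 2600.0
floor(log10(number))  // 3.0, the power of ten in the literal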
Looking at various posts on this topic, but still no luck. Is there a simple way to do the division/conversion when dividing a Double (or Float) by an Int? Here is a simple example in a playground returning the error "Double is not convertible to UInt8".
var score: Double = 3.00
var length: Int = 2 // it is taken from an array length and does not return a decimal or float
var result: Double = (score / length)
Convert the Int to a Double with:
var result: Double = score / Double(length)
Before computing the division, this creates a new Double from the Int inside the parentheses (initializer-like syntax), so both operands of / have the same type.
You cannot mix different numeric types in a single operation.
You need to convert them all to the same type to be able to divide them.
The easiest way I see to make that happen would be to make the Int a Double.
For a literal, you can do that quite simply by adding ".0" to the end of the integer; for a variable like length, use the Double(length) initializer shown above.
Also, FYI:
Floats are pretty rarely used, so unless you need one for something specific, it's more natural to use the more common Double.
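Putting it together, a version of the original snippet that compiles in a playground (a minimal sketch):

let score: Double = 3.00
let length = 2                       // e.g. an array's count, which is an Int
let result = score / Double(length)  // 1.5, inferred as Double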
I need to convert this 50-digit string 53503534226472524250874054075591789781264330331690 into an appropriate number type. I tried this:
let str = "53503534226472524250874054075591789781264330331690"
let num = str.toInt(); // Returns nil
let num = Int64(str.toInt()); // Errors out
The maximum value of a signed 64-bit integer (Int64) is 9,223,372,036,854,775,807, which is only 19 digits, so you cannot convert a 50-digit number just like that.
You need something like the BigInt class found in other languages. Check this other question, where the answers suggest alternatives for BigInt in Swift:
BigInteger equivalent in Swift?
In summary, there are third-party libraries for arbitrarily long integers. The only alternative from Apple is NSDecimalNumber, but its mantissa is limited to 38 digits, whereas your number has 50.
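For illustration, in Swift 2 and later the failable initializer Int(_:) replaced toInt(), and it still returns nil here because the value overflows 64 bits:

let str = "53503534226472524250874054075591789781264330331690"
let num = Int(str)  // nil: the string is numeric, but the value does not fit in an Int
Int.max             // 9223372036854775807 (19 digits on a 64-bit platform)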
I am confused by what is returned when performing number operations in Swift between various types. Consider the following:
var castedFoo = Float(7.0/5.0)      // returns 1.39999997...
var specifiedTypeFoo: Float = 7/5.0 // returns 1.39999997...
var foo = (7/5.0)                   // returns 1.4
What separates the first two from the last one? They are all returning floats, so why is the value from the last one rounded? I understand that the first is cast and the second is explicitly specified to be a Float, but the last one also returns a floating-point value. So what makes the difference here?
According to Swift documentation,
Unless otherwise specified, the default type of a floating-point literal is the Swift standard library type Double, which represents a 64-bit floating-point number.
In other words, the literal 5.0 is of type Double.
Your first two examples make the result a Float; in your last example the result stays a Double, because both literals are inferred as Double (the integer literal 7 becomes 7.0, so this is Double division, not Int divided by Double). Because of that difference, the last result has higher precision, and printing it shows no visible rounding error.
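A quick playground check of the inferred types (a small sketch using type(of:)):

let castedFoo = Float(7.0/5.0)
let foo = 7/5.0

print(type(of: castedFoo)) // Float
print(type(of: foo))       // Double: both literals in 7/5.0 were inferred as Double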