Read a double * in Swift

I have an Obj-C method that returns a double *, how is this accessed in Swift as a Double?
I get this error
Cannot convert value of type 'UnsafeMutablePointer<Double>' to expected argument type 'Double'
I am calling this - (double * _Nonnull) modIntensityForDestination:(int) destination;
and failing when I do this
let intensity = audioEngine.modIntensityForDestination(Int32(modDestinationID))
I have tried withUnsafeMutablePointer but can't seem to get it working.

If the method returns a pointer to a single floating point number
then you can dereference it with .memory:
let intensity = audioEngine.modIntensityForDestination(...).memory
In Swift 3 it would be .pointee.
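In newer Swift, a minimal sketch of that dereference, assuming the Obj-C declaration from the question is imported as a method returning UnsafeMutablePointer<Double> (audioEngine and modDestinationID are the names used in the question):
// Imported roughly as:
// func modIntensityForDestination(_ destination: Int32) -> UnsafeMutablePointer<Double>

let pointer = audioEngine.modIntensityForDestination(Int32(modDestinationID))
let intensity = pointer.pointee   // dereference the pointer to read the Double (use .memory in Swift 2)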


Why Int and Float literals are allowed to be added, but Int and Float variables are not allowed to do the same in Swift?

I tried adding an Int and Float literal in Swift and it compiled without any error:
var sum = 4 + 5.0 // sum is assigned with value 9.0 and type Double
But, when I tried to do the same with Int and Float variables, I got a compile-time error and I had to type-cast any one operand to the other one's type for it to work:
var i: Int = 4
var f:Float = 5.0
var sum = i + f // Binary operator '+' cannot be applied to operands of type 'Int' and 'Float'
Why is it happening so ? Is it related to type safety in any way ?
If you want a Double result:
let i: Int = 4
let f: Float = 5.0
let sum = Double(i) + Double(f)
print("This is the sum:", sum)
If you want an Int result:
let i: Int = 4
let f: Float = 5.0
let sum = i + Int(f)
print("This is the sum:", sum)
In the case of var sum = 4 + 5.0, the compiler automatically treats the literal 4 as a floating-point value (a Double here), because that is what is required to perform the operation.
The same happens if you write var x: Float = 4. The literal 4 is automatically converted to a Float.
In the second case, since you have explicitly defined the types of the variables, the compiler does not have the freedom to change them as required.
For a solution, look at @Fabio's answer.
The document on Swift.org says:
Type inference is particularly useful when you declare a constant or variable with an initial value. This is often done by assigning a literal value (or literal) to the constant or variable at the point that you declare it. (A literal value is a value that appears directly in your source code, such as 42 and 3.14159 in the examples below.)
For example, if you assign a literal value of 42 to a new constant
without saying what type it is, Swift infers that you want the
constant to be an Int, because you have initialized it with a number
that looks like an integer:
let meaningOfLife = 42 // meaningOfLife is inferred to be of type Int
Likewise, if you don’t specify a type for a floating-point literal,
Swift infers that you want to create a Double:
let pi = 3.14159 // pi is inferred to be of type Double
Swift always chooses Double (rather than Float) when inferring the type of floating-point numbers.
If you combine integer and floating-point literals in an expression, a
type of Double will be inferred from the context:
let anotherPi = 3 + 0.14159 // anotherPi is also inferred to be of type Double
The literal value of 3 has no explicit type in and of itself, and so an appropriate output type of Double is inferred from the presence of a floating-point literal as part of the addition.
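To see that difference in a playground, here is a small sketch (the names are illustrative, not from the question):
let sum1 = 4 + 5.0        // OK: the literal 4 is inferred as Double, so sum1 == 9.0 (Double)

let i: Int = 4
let f: Float = 5.0
// let bad = i + f        // error: '+' cannot be applied to operands of type 'Int' and 'Float'
let sum2 = Float(i) + f   // once the types are fixed, an explicit conversion is required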

Cannot convert value of type 'Int' to expected argument type 'Double'

So I am following this course called "Code With Chris - 14 Day Beginner Challenge (SwiftUI)" (yes, I am a beginner), and after each lesson there is a challenge. I have almost completed this challenge, but I couldn't figure out why it wouldn't work, so I checked the Dropbox of the completed challenge and I had everything pretty much the same. I have found a solution similar to the source, but I still don't understand why my first version (first picture) won't work. I copied everything identically from the source code and it won't work. Is it possible that this is the fault of the source code's creators, instead of mine?
My expected result is for the Int to work just like the Double did. The number of people is 5, so I don't see why it wouldn't.
My actual result is an error.
My goal is to complete this challenge:
We’re going to be trying out some math operations in a Swift Playground.
Open Xcode and create a new playground
(File Menu->New->Playground).
From the list of Playground templates, just select “Blank”
Challenge 1
Declare a struct called TaxCalculator
Declare a property inside called tax and set it to a decimal value representing the amount of sales tax where you live
Declare a method inside called totalWithTax that accepts a Double as an input parameter and returns a Double value.
Inside that method, write the code to return a Double value representing the input number with tax included
Challenge 2
Declare a struct called BillSplitter
Declare a method inside called splitBy that:
has an input parameter of type Double representing a subtotal
has an input parameter of type Int representing the number of people
returns a Double value
Inside that method, use an instance of TaxCalculator (from challenge 1 above) to calculate the total with tax and then split the bill by the number of people passed into the method.
Return the amount that each person has to pay.
Challenge 3
Create an instance of BillSplitter
Use the instance to print out the amount that each person pays (Assuming 5 people with a bill of $120)
The Code of the course I am using:
https://www.dropbox.com/sh/7aopencivoiegz4/AADbxSj83wt6mPNNgYcARFAsa/Lesson%2009?dl=0&file_subpath=%2FL9+Challenge+Solution.playground%2FContents.swift&preview=L9+Challenge+Solution.zip&subfolder_nav_tracking=1
[image: the code with an error]
[image: the code without an error]
//https://learn.codewithchris.com/courses/take/start/texts/18867185-lesson-9-challenge
//Challenge1
struct TaxCalculator {
    var tax = 0.15
    func totalWithTax(_ subtotal: Double) -> Double {
        return subtotal * (1 + tax)
    }
}
//Challenge2
struct BillSplitter {
    func splitBy(subtotal: Double, numPeople: Int) -> Double { // here is the problem
        let taxCalc = TaxCalculator()
        let totalWithTax = taxCalc.totalWithTax(subtotal)
        return totalWithTax/numPeople
    }
}
let Split = BillSplitter()
print(Split.splitBy(subtotal: 120, numPeople: 5))
totalWithTax is a Double. numPeople is an Int.
You need to convert numPeople to a Double too.
return totalWithTax / Double(numPeople)
Operators like / don't work with mismatching types.
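Dropping that into the method from the question, a sketch of the corrected splitBy (everything else unchanged from the posted code) would look like:
struct BillSplitter {
    func splitBy(subtotal: Double, numPeople: Int) -> Double {
        let taxCalc = TaxCalculator()
        let totalWithTax = taxCalc.totalWithTax(subtotal)
        return totalWithTax / Double(numPeople) // convert the Int so both operands of '/' are Double
    }
}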
Swift is a bit of a pain with scalar types. Most C family languages will quietly "promote" scalar types to other types as long as there is no loss of data.
byte->int->long int->float->double all happen silently.
In C, this code just works:
int a = 2;
double b = 2.5;
double c = a * b;
The value a gets promoted to a double, and the result is that c contains the double value 5.0.
Not so with Swift.
In Swift, you have to explicitly cast a to a double. It won't let you multiply an Int and a Double unless you explicitly cast the Int to a Double, as aheze said in their answer:
return totalWithTax / Double(numPeople)
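For comparison, the Swift equivalent of that C snippet (an illustrative sketch, not code from the question) refuses to compile until the conversion is spelled out:
let a: Int = 2
let b: Double = 2.5
// let c = a * b          // error: binary operator '*' cannot be applied to operands of type 'Int' and 'Double'
let c = Double(a) * b     // explicit conversion required; c == 5.0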

Swift - Binary operator cannot be applied to operands when converting degrees to radians

I'm aware of some relatively similar questions on this site, but if they do apply to my problem (which I'm not certain they do) then I certainly don't understand them. Here's my problem;
var degrees = UInt32()
var radians = Double()
let degrees:UInt32 = arc4random_uniform(360)
let radians = angle * (M_PI / 180)
This returns an error, focused on the multiplication star, reading: "Binary operator '*' cannot be applied to operands of type 'UInt32' and 'Double'".
I'm fairly sure I need to have the degrees variable be of type UInt32 to randomise it, and also that the pi constant cannot be made to be of UInt32, or at least I don't know how, as I'm relatively new to Xcode and Swift in general.
I'd be very grateful if anyone had a solution to my problem.
Thanks in advance.
let degree = arc4random_uniform(360)
let radian = Double(degree) * .pi/180
You need to convert the degrees value to a Double before the multiplication.
From the Apple Swift book:
Integer and Floating-Point Conversion
Conversions between integer and floating-point numeric types must be made explicit:
let three = 3
let pointOneFourOneFiveNine = 0.14159
let pi = Double(three) + pointOneFourOneFiveNine
// pi equals 3.14159, and is inferred to be of type Double
Here, the value of the constant three is used to create a new value of type Double, so that both sides of
the addition are of the same type. Without this conversion in place, the addition would not be allowed.
Floating-point to integer conversion must also be made explicit. An integer type can be initialized
with a Double or Float value:
let integerPi = Int(pi)
// integerPi equals 3, and is inferred to be of type Int
Floating-point values are always truncated when used to initialize a new integer value in this way.
This means that 4.75 becomes 4, and -3.9 becomes -3.
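For example, in a playground this truncation looks like (values chosen to match the quoted passage):
let integerPi = Int(3.14159)   // 3
let truncated = Int(4.75)      // 4, the fractional part is dropped
let negative = Int(-3.9)       // -3, truncation is toward zero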

Swift operator "*" throwing error on two Ints

I have a very odd error here. I've searched all around and I have tried all the suggestions. None work.
scrollView.contentSize.height = 325 * globals.defaults.integer(forKey: "numCards")
Binary operator '*' cannot be applied to two 'Int' operands
WTF Swift! Why not? I multiply Ints all the time. These ARE two Ints. globals.defaults is just an instance of UserDefaults.standard. I have tried the following with the same error each time.
325 * Int(globals.defaults.integer(forKey: "numCards")) //NOPE
Int(325) * Int(globals.defaults.integer(forKey: "numCards")) //NOPE
if let h = globals.defaults.integer(forKey: "numCards"){
325 * h //NOPE, and 'Initializer for conditional binding must have optional type, not Int'
}
let h = globals.defaults.integer(forKey: "numCards") as! Int
325 * h //NOPE, and 'Forced cast of 'Int' to same type has no effect'
325 * 2 //YES! But no shit...
All of those "attempts" seemed like a waste of time as I know for a fact both of these are Ints... and I was correct. Please advise. Thanks!
The error is misleading. The problem is actually the attempt to assign an Int value to a CGFloat variable.
This will work:
scrollView.contentSize.height = CGFloat(325 * globals.defaults.integer(forKey: "numCards"))
The cause of the misleading error (thanks to Daniel Hall in the comments below) is due to the compiler choosing the * function that returns a CGFloat due to the return value needed. This same function expects two CGFloat parameters. Since the two arguments being provided are Int instead of CGFloat, the compiler provides the misleading error:
Binary operator '*' cannot be applied to two 'Int' operands
It would be nice if the error was more like:
Binary operator '*' cannot be applied to two 'Int' operands. Expecting two 'CGFloat' operands.
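An equivalent fix, sketched against the same scrollView and UserDefaults setup from the question, is to convert only the Int operand, so the literal 325 is then inferred as CGFloat and the whole multiplication happens in CGFloat:
scrollView.contentSize.height = 325 * CGFloat(globals.defaults.integer(forKey: "numCards"))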

Swift: NSNumber is not a subtype of Float

I tried the code from the Swift Programming Language book in a playground and got the following error: "NSNumber is not a subtype of Float". I just modified it slightly by making x and y of type Float in struct Point. What am I missing?
If I added Float type to centerX and centerY, I got error: Could not find an overload for '/' that accepts the supplied arguments.
The error message is completely unrelated to the actual error... The actual error is cannot convert Double to Float.
In Size, width and height are Double (the default type for a floating-point literal), but in Point, x and y are Float. They are different types and you can't mix them without explicit conversion.
There are a number of ways to fix it. You can change them all to Double or all to Float.
e.g.
class Point
{
    var x: Double
    var y: Double
}
Or you can convert them to the correct type by doing Float(centerX).
PS: Can you post the code next time so I can change it without retyping it?
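A minimal sketch of the "make everything Double" fix, assuming structs along the lines of the book's Rect example (the exact original code was not posted, so the member names here are assumptions based on that example):
struct Point {
    var x = 0.0, y = 0.0            // Double throughout
}

struct Size {
    var width = 0.0, height = 0.0   // also Double, so nothing mixes Float and Double
}

struct Rect {
    var origin = Point()
    var size = Size()
    var center: Point {
        let centerX = origin.x + (size.width / 2)   // Double divided by an integer literal is fine
        let centerY = origin.y + (size.height / 2)
        return Point(x: centerX, y: centerY)
    }
}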