I wrote this code in a Swift Playground, but the result is wrong:
import UIKit
var degree: Double = 60
var result = cos(degree)
--
The result should be 0.5, but the Playground gives me -0.9524129804151563.
If I use 30 degrees, the result is 0.154251449887584.
What is wrong?
Trigonometric functions that take angles treat values as if they are expressed in radians, not degrees. When you pass 60, you get back cosine of 60 radians, not 60 degrees. To convert degrees to radians, multiply the value by π, and divide by 180.
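For example, a minimal fix of the Playground code above, converting the value before calling cos:

import UIKit

var degree: Double = 60
let radians = degree * .pi / 180   // 60 degrees expressed in radians
let result = cos(radians)          // ≈ 0.5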
Related
I would like to perform a rotation using a 4x4 matrix in Swift, but it behaves unexpectedly: 200 degrees + 45 degrees = 115 degrees, not 245.
import SwiftUI   // for Angle
import simd      // for simd_quatf and float4x4

let degree200 = Angle(degrees: 200).radians
let degree45 = Angle(degrees: 45).radians
// 200 degrees + 45 degrees
let rotationMatrix = float4x4(simd_quatf(angle: Float(degree200 + degree45), axis: SIMD3<Float>(0, 1, 0)))
// it prints 115 degrees, not 245
print(Angle(radians: Double(simd_quatf(rotationMatrix).angle)).degrees)
I assume that's a typo and you in fact meant -115 degrees (remainder(245, 360))? When using quaternions and matrices to express orientations, you can only expect to see values from -180 to +180 degrees when converting those values back to Euler angles.
In general it is impossible to convert back to Euler angles from either a quaternion or a matrix and get the original input values back. You either store the original Euler angles and present those to the user, or you need a known starting Euler value and an Euler filter to obtain approximately correct results.
The only correct way to get your expected result is to NOT print the value after conversion to quaternions, but to use the angles you already have:
print(Angle(radians: degree200 + degree45).degrees)
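As a sketch of that suggestion (variable names here are mine): keep the angle you already have and print it, rather than reading it back from the matrix.

import SwiftUI
import simd

let totalDegrees: Double = 200 + 45
let total = Angle(degrees: totalDegrees)
print(total.degrees)   // 245 -- the value to store and show to the user

// Reading the angle back after a round trip through the matrix gives the
// wrapped equivalent (the question above reports 115), not 245.
let matrix = float4x4(simd_quatf(angle: Float(total.radians), axis: SIMD3<Float>(0, 1, 0)))
print(Angle(radians: Double(simd_quatf(matrix).angle)).degrees)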
Well, I know 115 and 245 add up to 360. Just a guess, but maybe you're rotating the wrong way? Maybe try negative values and see what happens.
I need to evaluate some expression for all angles from 0 to 90 degrees in increments of 10 degrees (of course, the expression depends on some trigonometric function).
It looks like:
for alpha = 0:10:90
func(alpha) = c * sin(alpha)
end
Does anyone know how to work with degrees here? Please tell me.
It should be (pi/18 radians is 10 degrees, and pi/2 is 90 degrees):
for alpha = 0:pi/18:pi/2
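If you are doing this in Swift rather than MATLAB-style pseudocode, here is a sketch of the same loop (c is assumed to be some constant of your own):

import Foundation

let c = 2.0   // assumed constant
for alphaDegrees in stride(from: 0.0, through: 90.0, by: 10.0) {
    let alpha = alphaDegrees * .pi / 180   // convert to radians before calling sin
    print(alphaDegrees, c * sin(alpha))
}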
I have the following calculations:
let sinX = sin(150.0) // returns -0.71487
let cosY = cos(150.0) // returns 0.699250
But the real values should be sinX = 0.5 and cosY = -0.86.
Does anybody know where the error is?
The calculation is correct. However, sin and cos take their parameter in radians, not degrees.
In calculus and most other branches of mathematics beyond practical
geometry, angles are universally measured in radians. One radian is
equal to 180/π degrees.
To convert from radians to degrees, multiply by 180/π.
https://en.wikipedia.org/wiki/Radian
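Another route to the same conversion is Foundation's Measurement API; a small sketch:

import Foundation

let angle = Measurement(value: 150, unit: UnitAngle.degrees)
let radians = angle.converted(to: .radians).value
print(sin(radians))   // ≈ 0.5
print(cos(radians))   // ≈ -0.866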
Also, are you sure the sin and cos functions haven't been redefined or shadowed somewhere? That happens in code bases. If so, you might want to re-check your call.
For example,
sin(33.35) = 0.5523 if the argument is in degrees
sin(33.35) = 0.9347 if the argument is in radians
and Xcode gives answers in radians by default.
So is there any way to get the answer in degrees?
Yes, you multiply radians by a constant, 180/pi, to get degrees. That's because there are 2 * pi radians in a complete circle of 360 degrees (so half a circle is pi radians and 180 degrees).
Just keep in mind that's a conversion you have to apply to the input of the sine function (the angle), not the output (which is a ratio, not an angle).
The pi constant can be used by including math.h and using the M_PI symbol.
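In Swift, the same conversion can be wrapped in a small extension; a sketch (the property names are illustrative, not standard API):

import Foundation

extension Double {
    var degreesToRadians: Double { self * .pi / 180 }
    var radiansToDegrees: Double { self * 180 / .pi }
}

print(sin(33.35.degreesToRadians))   // ≈ 0.55, sine of 33.35 degrees
print(sin(33.35))                    // ≈ 0.93, sine of 33.35 radians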
A full circle consists of 360 degrees, and a full circle in radians is 2*pi. So, to convert radians to degrees, you divide by pi and multiply by 180.
radians * 180 / M_PI
If you want to convert from degrees to radians, for example to provide an angle for an animation, use this:
degrees * M_PI / 180
I use #defines for this myself:
#define DEGREES(radians)((radians)*180/M_PI)
#define RADIANS(degree)((degree)*M_PI/180)
Simply write RADIANS(degree) or DEGREES(radians) anywhere in your code, substituting the degree or radian value you have. In my opinion this also keeps the code more readable if you are not used to radians.
Get the value in radians, then do:
CGFloat degrees = (radians * 180) / M_PI;
M_PI is a #define located in math.h.
Fortunately, the Objective-C language is a proper superset of C, which includes multiplication and division operators.
There are two views:
viewA and viewB. Both are rotated.
The coordinate system for rotation is weird: it goes from 0 to 179.999999 or -179.999999 degrees. So essentially 179.999999 and -179.999999 are very close together!
I want to calculate how much degrees or radians are between these rotations.
For example:
viewA is rotated at 20 degrees
viewB is rotated at 30 degrees
I could just do: rotationB - rotationA = 10.
But the problem with this formula:
viewA is rotated at 179 degrees
viewB is rotated at -179 degrees
that would go wrong: rotationB - rotationA = -179 - 179 = -358
358 is plain wrong, because they are very close together in reality. One thing I could do is check whether the absolute result is bigger than 180, and if so, calculate it the other way around to get the true short delta. But this feels wrong and fragile because of possible floating point errors and imprecision. If two views are both rotated at essentially 179.99999999999 degrees, I might get a weird 180, or a 0 if I am lucky.
Maybe there's a clever math formula with pi, sine, or other useful stuff to get around this problem?
EDIT: The original answer (using mod) was wrong; it would have given 180 minus the right answer in certain circumstances (angles 30 and -20, for example, would give 130 instead of the correct 50).
Two correct answers for all scenarios:
If A1 and A2 are two angles (between -179.99999 and 179.99999),
and Abs means take the absolute value,
the angular distance between them is expressed by:
Angle between = 180 - Abs(Abs(A1 - A2) - 180)
Or, using the C-style ternary operator, with d = Abs(A1 - A2):
Angle between = d > 180 ? 360 - d : d
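A Swift sketch of the first formula (the function name is mine):

import Foundation

// Shortest angular distance, in degrees, between two angles in (-180, 180].
func angleBetween(_ a1: Double, _ a2: Double) -> Double {
    180 - abs(abs(a1 - a2) - 180)
}

print(angleBetween(179, -179))   // 2
print(angleBetween(30, -20))     // 50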
Judging from the recent questions you've asked, you might want to read up on the unit circle. This is a fundamental concept in trigonometry, and it is how angles are calculated when doing rotations using CGAffineTransforms or CATransform3Ds.
Basically, the unit circle goes from 0 to 360 degrees, or 0 to 2 * pi (M_PI is the constant used on the iPhone) radians. Any angle greater than 360 degrees is the same as that angle minus a multiple of 360 degrees. For example, 740 degrees is the same as 380 degrees, which is the same as 20 degrees, when it comes to the ending position of something rotated by that much.
Likewise, negative degrees are the same as if you'd added a multiple of 360 degrees to them. -20 degrees is the same as 340 degrees.
There's no magic behind any of these calculations; you just have to pay attention to when something crosses the 0 / 360 degree point on the circle. In the case you describe, you can add 360 to any negative values to express them as positive angles. When subtracting angles, if the ending angle is less than the starting angle, you may also need to add 360 to the result to account for crossing the zero point on the unit circle.
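A Swift sketch of that normalization (the helper name is mine):

import Foundation

// Normalize an angle in degrees to the range [0, 360).
func normalized(_ degrees: Double) -> Double {
    var d = degrees.truncatingRemainder(dividingBy: 360)
    if d < 0 { d += 360 }
    return d
}

print(normalized(740))   // 20
print(normalized(-20))   // 340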
Let's try this again:
There are two angles between A and B. One of them is
θ1 = |A - B|
The other is
θ2 = 360 - θ1
So just take the minimum of those two.
In addition to Brad Larson's excellent answer I would add that you can do:
CGFloat adjustAngle(CGFloat angle) { return fmod(angle + 180.0, 360.0); }
...
CGFloat difference = fmod(adjustAngle(angle1) - adjustAngle(angle2), 360.0);
Take the difference, add 360, and mod by 360.
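A Swift sketch of that suggestion (values are illustrative); combine it with the min step from the earlier answer if you want the unsigned shortest separation:

import Foundation

let a: Double = 179
let b: Double = -179

// Difference normalized into [0, 360).
let diff = (b - a + 360).truncatingRemainder(dividingBy: 360)
print(diff)                   // 2
print(min(diff, 360 - diff))  // shortest separation, also 2 here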