Calculating Equation of Center in Swift

I am working on a pet project that involves calculating sunrise/sunset data. I am struggling to implement the following formula in Swift:
Equation of Center:
C = (1.9148 * sin(meanSolarAnomaly)) + (0.0200 *
sin(2 * meanSolarAnomaly)) + (0.0003 * sin(3 * meanSolarAnomaly))
Here is the answer I should be getting for my given Lat/Lon:
C = 1.9148 * sin(18.30143135945) + 0.0200 * sin(2 * 18.30143135945) +
0.0003 * sin(3 * 18.30143135945) = 0.61344892821988
Here is my code, which is not giving the correct final value:
// meanSolarAnomaly has a value of 18.30143135945036 at this point
let center = (1.9148 * sin(meanSolarAnomaly)) + (0.0200 * sin(2*meanSolarAnomaly)) + (0.0003 * sin(3*meanSolarAnomaly))
My code says center = -1.015867439183884 while the correct center = 0.61344892821988
I have double and triple checked the equation but I can't seem to spot my error. I am hoping it is a simple syntax mistake but will be embarrassed if it is...
I am working from the equation and answers provided here.
EDIT: Here is the complete code:
//: Playground - noun: a place where people can play
import UIKit
//calculator to determine what time of day Sunset / Sunrise will occur
func jdFromDate(date : NSDate) -> Double {
    let JD_JAN_1_1970_0000GMT = 2440587.5
    return JD_JAN_1_1970_0000GMT + date.timeIntervalSince1970 / 86400
}
func dateFromJd(jd : Double) -> NSDate {
    let JD_JAN_1_1970_0000GMT = 2440587.5
    return NSDate(timeIntervalSince1970: (jd - JD_JAN_1_1970_0000GMT) * 86400)
}
let julianOffset = 2451545 as Double
let refDateFormatter = NSDateFormatter()
refDateFormatter.dateFormat = "MM-dd-yyyy"
let today = NSDate()
let julianDaysToToday = round(jdFromDate(today))
//get the lat/lon variables set (Tampa in example)
let lat = 27.9681
let lon = 82.4764
//now we need to calculate julian cycle
let nRaw = (julianDaysToToday - julianOffset - 0.0009) - (lon / 360)
let n = round(nRaw)
//n now contains the julian cycle
//next we must calculate the julian date of solar noon (approximately)
//J* = 2451545 + 0.0009 + (lw/360) + n
let jSolarNoon = julianOffset + 0.0009 + (lon/360) + n
//next calculate the mean solar anomaly
//M = [357.5291 + 0.98560028 * (J* - 2451545)] mod 360
let meanSolarAnomaly = (357.5291 + 0.98560028 * (jSolarNoon - julianOffset)) % 360
//next calculate the equation of center
let center = (1.9148 * sin(meanSolarAnomaly)) + (0.0200 * sin(2*meanSolarAnomaly)) + (0.0003 * sin(3*meanSolarAnomaly))
//Now, using Center and Mean, calculate the ecliptical longitude of the sun.
//λ = (M + 102.9372 + C + 180) mod 360
let eclLonOfSun = (meanSolarAnomaly + 102.9372 + center + 180) % 360
//now we can finally get an accurate julian date for solar noon
let jTransit = jSolarNoon + (0.0053 * sin(meanSolarAnomaly)) - (0.0069 * sin(2 * eclLonOfSun))
//To calculate the hour angle we need to find the declination of the sun
//δ = arcsin( sin(λ) * sin(23.45) )
let declinationOfSun = asin(sin(eclLonOfSun) * sin(23.45))
//now calculate the hour angle
//H = arccos( [sin(-0.83) - sin(ln) * sin(δ)] / [cos(ln) * cos(δ)] )
let hourCosNum = sin(-0.83) - sin(lat) * sin(declinationOfSun)
let hourDenom = cos(lat)*cos(declinationOfSun)
let hourAngle = acos(hourCosNum/hourDenom)
//time to go back through the approximation again using the hour angle
let finalJulianApproximation = 2451545 + 0.0009 + ((hourAngle + lon)/360) + n
//The values of M and λ from above don't really change from solar noon to sunset, so there is no need to recalculate them before calculating sunset.
let jSet = finalJulianApproximation + (0.0053 * sin(meanSolarAnomaly)) - (0.0069 * sin(2*eclLonOfSun))
let sunset = dateFromJd(jSet)

As @kennytm has suggested, the mean anomaly (of the sun or anything else) is an angle. Angles in Swift (and C, from which the maths libraries come) are all in radians, whilst astronomers talk in degrees. Here is your code in a Playground:
var meanSolarAnomaly = 18.30143135945036
var c = (1.9148 * sin(meanSolarAnomaly)) + (0.0200 * sin(2 * meanSolarAnomaly)) + (0.0003 * sin(3 * meanSolarAnomaly))
// = -1.01586743918389 - wrong answer
meanSolarAnomaly = meanSolarAnomaly * M_PI / 180.0
// Convert it to radians
c = (1.9148 * sin(meanSolarAnomaly)) + (0.0200 * sin(2 * meanSolarAnomaly)) + (0.0003 * sin(3 * meanSolarAnomaly))
// = 0.6134489282198807 - right answer
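If you would rather keep the astronomical formulas in degrees throughout, one option is a small degree-aware wrapper around sin. This is only a sketch; sinDegrees is a hypothetical helper, not part of the standard library:
import Foundation

// Hypothetical helper: sine of an angle expressed in degrees.
func sinDegrees(_ degrees: Double) -> Double {
    return sin(degrees * M_PI / 180.0)
}

let meanSolarAnomalyDegrees = 18.30143135945036
let center = (1.9148 * sinDegrees(meanSolarAnomalyDegrees)) +
             (0.0200 * sinDegrees(2 * meanSolarAnomalyDegrees)) +
             (0.0003 * sinDegrees(3 * meanSolarAnomalyDegrees))
// center ≈ 0.6134489282198807, matching the expected value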

Related

How to modify this code to return Geopoint

I would like this code to return a newly constructed geopoint.
I need this,
GeoPoint prjTest=new GeoPoint(vxi+x,vyi+y);
to stick somewhere and return prjTest. I'm new to programming and I don't know the syntax well. I tried many things; I can keep guessing for a long time. Please help. Thanks.
public class ProjectileTest
{
    public ProjectileTest(float vi, float angle) /** renamed velocity -> vi */
    {
        //Starting location
        double xi = 0, yi = 100;
        final double gravity = -9.81;
        //timeSlice declares the interval before checking the new location.
        double timeSlice = 0.001; /** renamed time -> timeSlice */
        double totalTime = 0; /** renamed recordedTime -> totalTime */
        double vxi = vi * Math.cos(Math.toRadians(angle)); /** renamed xSpeed -> vxi */
        double vyi = vi * Math.sin(Math.toRadians(angle)); /** renamed ySpeed -> vyi */
        //This (secondsTillImpact) seems to give a very accurate time whenever the angle is positive.
        double secondsTillImpact = Math.sqrt(2 * yi / -(gravity));
        /** Not sure I agree. Does this formula take into account the upward motion
         * of the projectile along its parabolic arc? My suspicion is that this
         * formula only "works" when the initial theta is: 180 <= angle <= 360.
         *
         * Compare with the result predicted by quadratic(). Discarding the zero
         * intercept which can't work for us (i.e. the negative one, because time
         * only moves forward) leaves us with an expected result very close to the
         * observed result.
         */
        double y; /** Current position along the y-axis */
        double x;
        do {
            // x = x + (xSpeed * time);
            x = vxi * totalTime; /** Current position along the x-axis */
            // y = y + (ySpeed * time);
            y = yi + vyi * totalTime + .5 * gravity * (totalTime * totalTime);
            // ySpeed = ySpeed + (gravity * time);
            double vy = vyi + gravity * totalTime; /** Current velocity of vector y-component */
            System.out.println("X: " + round2(x) + " Y: " + round2(y) + " YSpeed: " + round2(vy));
            totalTime += timeSlice;
        } while (y > 0);
        ////////////////////////////++++++++ GeoPoint prjTest=new GeoPoint(vxi+x,vyi+y);
        System.out.println("Incorrectly expected seconds: " + secondsTillImpact + "\nResult seconds: " + totalTime);
        quadratic((.5 * gravity), vyi, yi);
    }

    public double round2(double n) {
        return (int) (n * 100.0 + 0.5) / 100.0;
    }

    public void quadratic(double a, double b, double c) {
        if (b * b - 4 * a * c < 0) {
            System.out.println("No roots in R.");
        } else {
            double dRoot = Math.sqrt(b * b - 4 * a * c); /** root the discriminant */
            double x1 = (-b + dRoot) / (2 * a); /** x-intercept 1 */
            double x2 = (-b - dRoot) / (2 * a); /** x-intercept 2 */
            System.out.println("x-int one: " + x1 + " x-int two: " + x2);
        }
    }
}
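Since the question boils down to returning the computed point instead of only printing it inside the constructor, here is a rough Swift analogue of the same simulation loop that returns the final point; everything here (the names, CGPoint as the return type) is a hypothetical sketch, not a fix to the Java above:
import Foundation
import CoreGraphics

// Step the projectile forward with a fixed time slice and return the point
// (x, y) at (or just past) impact, instead of printing intermediate values.
func landingPoint(vi: Double, angleDegrees: Double) -> CGPoint {
    let gravity = -9.81
    let timeSlice = 0.001
    let yi = 100.0                                  // starting height, as in the Java version
    let vxi = vi * cos(angleDegrees * .pi / 180)
    let vyi = vi * sin(angleDegrees * .pi / 180)
    var totalTime = 0.0
    var x = 0.0
    var y = yi
    repeat {
        x = vxi * totalTime
        y = yi + vyi * totalTime + 0.5 * gravity * totalTime * totalTime
        totalTime += timeSlice
    } while y > 0
    return CGPoint(x: x, y: y)
}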

Flutter: Calculate destination point from current location and bearing

I'm looking to draw a line on a map from the current user position using their bearing/heading.
But the line always has the same direction even if I rotate the phone to face another direction. I wonder if I am missing something in my calculation.
final la1 = userPosition.latitude;
final lo1 = userPosition.longitude;
const r = 6367; // earth radius
const d = 40; // distance
const dist = d / r;
final bearing = vector.radians(direction.value);
final la2 = asin(sin(la1) * cos(dist) + cos(la1) * sin(dist) * cos(bearing));
final lo2 = lo1 +
    atan2(sin(bearing) * sin(dist) * cos(la1),
        cos(dist) - sin(la1) * sin(la2));
return LatLng(la2, lo2);
As you can see in the screenshots below, I create two lines with different bearings (check the map orientation), but they look the same.
import 'dart:math';
import 'package:google_maps_flutter/google_maps_flutter.dart';
import 'package:vector_math/vector_math.dart';

LatLng createCoord(LatLng coord, double bearing, double distance) {
  var radius = 6371e3; // meters
  var delta = distance / radius; // angular distance in radians
  var teta = radians(bearing);
  var phi1 = radians(coord.latitude);
  var lambda1 = radians(coord.longitude);
  var phi2 = asin(sin(phi1) * cos(delta) + cos(phi1) * sin(delta) * cos(teta));
  var lambda2 = lambda1 +
      atan2(sin(teta) * sin(delta) * cos(phi1),
          cos(delta) - sin(phi1) * sin(phi2));
  lambda2 = (lambda2 + 3 * pi) % (2 * pi) - pi; // normalise to -180..+180°
  return LatLng(degrees(phi2), degrees(lambda2)); // [lat, lng]
}
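For comparison with the Swift-centric questions on this page, here is a minimal Swift sketch of the same destination-point formula. It assumes CoreLocation's CLLocationCoordinate2D and is not part of the original answer:
import Foundation
import CoreLocation

// Destination coordinate given a start point, an initial bearing (degrees)
// and a distance (meters), on a spherical earth.
func destination(from start: CLLocationCoordinate2D,
                 bearingDegrees: Double,
                 distanceMeters: Double) -> CLLocationCoordinate2D {
    let radius = 6371e3                         // mean earth radius, meters
    let delta = distanceMeters / radius         // angular distance, radians
    let theta = bearingDegrees * .pi / 180
    let phi1 = start.latitude * .pi / 180
    let lambda1 = start.longitude * .pi / 180
    let phi2 = asin(sin(phi1) * cos(delta) + cos(phi1) * sin(delta) * cos(theta))
    var lambda2 = lambda1 + atan2(sin(theta) * sin(delta) * cos(phi1),
                                  cos(delta) - sin(phi1) * sin(phi2))
    lambda2 = (lambda2 + 3 * .pi).truncatingRemainder(dividingBy: 2 * .pi) - .pi   // normalise to -180..+180°
    return CLLocationCoordinate2D(latitude: phi2 * 180 / .pi,
                                  longitude: lambda2 * 180 / .pi)
}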

simd_quatf to Euler angles

I'm trying to convert my quaternions to Euler angles, but out of the x/y/z components only x has an accurate value; y and z are incorrect. Can anyone take a look and help?
func quatToEulerAngles(_ quat: simd_quatf) -> SIMD3<Double> {
    var angles = SIMD3<Double>();
    let qfloat = quat.vector
    let q = SIMD4<Double>(Double(qfloat.x), Double(qfloat.y), Double(qfloat.z), Double(qfloat.w))
    // roll (x-axis rotation)
    let sinr_cosp : Double = 2.0 * (q.w * q.x + q.y * q.z);
    let cosr_cosp : Double = 1.0 - 2.0 * (q.x * q.x + q.y * q.y);
    angles.x = atan2(sinr_cosp, cosr_cosp);
    // pitch (y-axis rotation)
    let sinp : Double = 2 * (q.w * q.y - q.z * q.x);
    if (abs(sinp) >= 1) {
        angles.y = copysign(Double.pi / 2, sinp); // use 90 degrees if out of range
    }
    else {
        angles.y = asin(sinp);
    }
    // yaw (z-axis rotation)
    let siny_cosp : Double = 2 * (q.w * q.z + q.x * q.y);
    let cosy_cosp : Double = 1 - 2 * (q.y * q.y + q.z * q.z);
    angles.z = atan2(siny_cosp, cosy_cosp);
    return angles;
}
This is the Wikipedia example converted to Swift.
TIA
My solution would be to let the (SceneKit) library do it:
func quatToEulerAngles(_ quat: simd_quatf) -> SIMD3<Float> {
    let n = SCNNode()
    n.simdOrientation = quat
    return n.simdEulerAngles
}
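As a quick, approximate sanity check of the SceneKit-based version above (assuming the function is in scope): a 90° rotation about the z-axis should come back with z ≈ π/2 and x ≈ y ≈ 0.
import SceneKit
import simd

let q = simd_quatf(angle: .pi / 2, axis: SIMD3<Float>(0, 0, 1))
let euler = quatToEulerAngles(q)
// euler ≈ SIMD3<Float>(0, 0, 1.5708)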
I took a look at https://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToEuler/ and converted it to Swift.
It works for me.
func quatToEulerAngles(_ quat: simd_quatf) -> SIMD3<Float> {
    var angles = SIMD3<Float>();
    let qfloat = quat.vector
    // heading = x, attitude = y, bank = z
    let test = qfloat.x*qfloat.y + qfloat.z*qfloat.w;
    if (test > 0.499) { // singularity at north pole
        angles.x = 2 * atan2(qfloat.x, qfloat.w)
        angles.y = (.pi / 2)
        angles.z = 0
        return angles
    }
    if (test < -0.499) { // singularity at south pole
        angles.x = -2 * atan2(qfloat.x, qfloat.w)
        angles.y = -(.pi / 2)
        angles.z = 0
        return angles
    }
    let sqx = qfloat.x*qfloat.x;
    let sqy = qfloat.y*qfloat.y;
    let sqz = qfloat.z*qfloat.z;
    angles.x = atan2(2*qfloat.y*qfloat.w - 2*qfloat.x*qfloat.z, 1 - 2*sqy - 2*sqz)
    angles.y = asin(2*test)
    angles.z = atan2(2*qfloat.x*qfloat.w - 2*qfloat.y*qfloat.z, 1 - 2*sqx - 2*sqz)
    return angles
}

Shortest distance from CGPoint to segment

I've been trying to implement the Douglas-Peucker algorithm in my code and I'm able to translate the pseudocode into Swift, except for the shortestDistanceToSegment function. The only Swift version I could find was answered here, but I don't understand what it actually does.
I need a function that takes three points as arguments (the point and both ends of the line segment) and returns the shortest distance between a CGPoint and a line segment. Some explanation of what (and why) the code does would be great, but it is not necessary.
Answer from https://stackoverflow.com/a/27737081/535275 with variables renamed and some comments added:
/* Distance from point p to the line segment v-w */
func distanceFromPoint(p: CGPoint, toLineSegment v: CGPoint, and w: CGPoint) -> CGFloat {
    let pv_dx = p.x - v.x
    let pv_dy = p.y - v.y
    let wv_dx = w.x - v.x
    let wv_dy = w.y - v.y
    let dot = pv_dx * wv_dx + pv_dy * wv_dy
    let len_sq = wv_dx * wv_dx + wv_dy * wv_dy
    let param = dot / len_sq
    var int_x, int_y: CGFloat /* intersection of normal to vw that goes through p */
    if param < 0 || (v.x == w.x && v.y == w.y) {
        int_x = v.x
        int_y = v.y
    } else if param > 1 {
        int_x = w.x
        int_y = w.y
    } else {
        int_x = v.x + param * wv_dx
        int_y = v.y + param * wv_dy
    }
    /* Components of normal */
    let dx = p.x - int_x
    let dy = p.y - int_y
    return sqrt(dx * dx + dy * dy)
}
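To make the geometry concrete: the code projects p onto the infinite line through v and w, clamps that projection to the segment (param in 0...1), and returns the distance from p to the clamped point. A couple of hypothetical checks, assuming the function above is in scope:
import CoreGraphics

// (5, 3) projects onto the segment (0,0)-(10,0) at (5, 0), so the distance is 3.
let d1 = distanceFromPoint(p: CGPoint(x: 5, y: 3),
                           toLineSegment: CGPoint(x: 0, y: 0),
                           and: CGPoint(x: 10, y: 0))      // 3.0
// (13, 4) projects past the w end (param > 1), so the distance is measured to w itself.
let d2 = distanceFromPoint(p: CGPoint(x: 13, y: 4),
                           toLineSegment: CGPoint(x: 0, y: 0),
                           and: CGPoint(x: 10, y: 0))      // 5.0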

Trajectory Prediction in Sprite Kit

I'm trying to predict where an object (ball) will move using the following formula
Calculations for t = 1 second
y = VelocityY * t + 0.5 * Gravity * t²
x = VelocityX * t
code below:
+ (CGVector)getTrajectoryPointWithInitialPosition:(CGVector)initialPosition andInitialVelocity:(CGVector)initialVelocity andSteps:(CGFloat)n andSceneLayer:(SKScene *)sceneLayer
{
    // Put data into correct units
    CGFloat t = 1.0 / 60.0;
    // m/s
    CGVector stepVelocity = CGVectorMake(t * initialVelocity.dx, t * initialVelocity.dy);
    // m/s^2
    CGVector stepGravity = CGVectorMake(t * t * sceneLayer.physicsWorld.gravity.dx, t * t * sceneLayer.physicsWorld.gravity.dy);
    initialPosition = CGVectorMake(initialPosition.dx + n * stepVelocity.dx,
                                   initialPosition.dy + n * stepVelocity.dy + 0.5 * (n * n + n) * stepGravity.dy);
    return CGVectorMake(initialPosition.dx + n * stepVelocity.dx,
                        initialPosition.dy + n * stepVelocity.dy + 0.5 * (n * n) * stepGravity.dy);
}
I then launch the ball (stationary/non-dynamic) using the following
CGVector aVelocity = CGVectorMake(initialVelocity.dx*17.5, initialVelocity.dy*17.5);
[ball.physicsBody setVelocity:aVelocity];
What I can't figure out is:
initialVelocity for the ball and for the trajectory prediction is the same. If it's the same, why does multiplying the ball's initialVelocity by 17.5 get the ball's movement and the predicted trajectory from above to match up? It looks like it's following the predicted path, but I don't understand why multiplying the ball's velocity by 17.5 makes the ball follow it.
Before multiplying by 17.5: http://i.imgur.com/bKkPGmh.png
After multiplying velocity by 17.5: http://i.imgur.com/Ae7sY4i.png
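For reference, here is a minimal Swift sketch of the same per-step prediction the Objective-C code returns; the fixed 60 Hz step and the function name are assumptions, not SpriteKit API:
import CoreGraphics
import SpriteKit

// Predict where a body will be after n physics steps, scaling velocity by the
// step length and gravity by the step length squared, as in the code above.
func trajectoryPoint(initialPosition: CGVector,
                     initialVelocity: CGVector,
                     gravity: CGVector,
                     steps n: CGFloat) -> CGVector {
    let t: CGFloat = 1.0 / 60.0                    // one physics step
    let stepVelocity = CGVector(dx: initialVelocity.dx * t, dy: initialVelocity.dy * t)
    let stepGravity = CGVector(dx: gravity.dx * t * t, dy: gravity.dy * t * t)
    return CGVector(dx: initialPosition.dx + n * stepVelocity.dx,
                    dy: initialPosition.dy + n * stepVelocity.dy + 0.5 * n * n * stepGravity.dy)
}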