Planet tilts and rotations - pyephem

We are building an astronomical clock that tracks the planet and moon locations in the Galilean solar system (the planets out to Saturn, plus Earth's moon and the 4 moons of Jupiter). Think of an orrery that accurately tracks time.
The clock can move to different epochs and can move through the 12 astronomical ages. It's 8 ft in diameter, hangs from the ceiling, and 24 stepper motors drive the rotations and tilts of the planets, the 18.6-year lunar cycle, etc.
We plan to use PyEphem to identify the locations of the planets. We need additional data and were wondering if Python can provide it.
We need the rotation and tilt of each planet (Earth, Mars, Saturn), and we need the rotation tied to the sun so we know which part of each planet faces it. Lastly, we need the locations of Jupiter's 4 major moons around the planet.
Does PyEphem support these additional items, and if not, is there any advice you can provide us?

Details of what PyEphem supports are available from the PyEphem website, which provides detailed, searchable documentation.
This is very standard for software libraries. I am not being disingenuous when I advise you to RTFM: this is the definitive material. Google and Bing should be your next port of call, especially since you have clearly defined search terms.

Alas, PyEphem does not include any models of planetary rotation, no. The lack of any mention of it in the documentation is not an omission but reflects the absence of that feature in the underlying libastro library.
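That said, two of your three items are within reach. PyEphem does ship the four Galilean moons (ephem.Io, ephem.Europa, ephem.Ganymede, ephem.Callisto), whose compute() fills in Jupiter-centred x, y, z offsets in Jupiter radii. For rotation you can layer the IAU rotation models on top yourself, since they give the prime-meridian angle W as a linear function of time. A minimal sketch follows; the Mars coefficients are the IAU 2009 values from memory, so verify them against the current WGCCRE report before trusting them:

```python
import ephem

# Galilean moon positions straight from PyEphem:
# x, y, z are Jupiter-centred offsets in Jupiter radii.
io = ephem.Io()
io.compute('2024/1/1')
print(io.x, io.y, io.z)

# Planetary rotation is not in PyEphem, but the IAU model for Mars is
# W = 176.630 + 350.89198226 * d degrees, with d = days since J2000.
J2000 = ephem.Date('2000/1/1 12:00:00')

def mars_prime_meridian_deg(date):
    d = ephem.Date(date) - J2000   # PyEphem dates subtract to days
    return (176.630 + 350.89198226 * d) % 360.0

print(mars_prime_meridian_deg('2024/1/1'))
```

Combining W with the sun's computed direction then tells you which meridian is sunlit, which is what your motors ultimately need.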

Related

Making `ScrewJoint` more compliant to the manipulator

I am working on a simulation which contains:
a bolt, welded upright to the world
a nut, connected to the bolt via a ScrewJoint. The mass of the nut is set to 0.02 kg; the inertia is a diagonal 1.1e-9 * I. This is configured via an .sdf file.
an iiwa manipulator, which is beside the point for now.
The problem is that the nut is very hard to manipulate, and I cannot find a parameter to adjust that could make it more lifelike. To be more specific:
I measure the ability of a force, applied tangentially in a horizontal plane to the nut, to cause the screwing motion of the joint that joins the nut with the bolt.
I'd like a greater amount of motion at lower forces, and so far I am failing to achieve that.
My interest in doing this is not idle; I am interested in more complicated simulations, which are also failing when the iiwa comes into contact with this same joint; I've asked about those here and here (both answered partially). To sum those up: when the manipulator grips the nut, the nut resists the screwing in such a manner that the Schunk gripper is forced to unclasp and the iiwa is thrown off-track, while the nut remains stationary.
I attach below two simpler experiments to better illustrate the issue:
1. Applying a 200 N force tangentially in a horizontal plane using ExternallyAppliedSpatialForce.
Graph notation (here and below):
The left graph contains linear quantities (m, m/s, etc.) along the world's Z axis; the right graph contains angular quantities (deg, deg/s, etc.) around the world's Z axis. Legend entries with a trailing apostrophe use the secondary Y-axis scale; other legend entries use the primary Y-axis scale.
Experiment summary:
This works as expected: 200 N is enough to make the nut spin on the bolt, with the nut traveling vertically along the bolt for just under 1 centimeter and rotating through more than 90 degrees. Note: the externally applied force does not show up on my graph.
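For reproducibility, here is roughly how such a force can be fed to the plant in pydrake. `plant`, `plant_context`, and the body name "nut" stand in for my actual setup, and the exact port-fixing idiom may vary across Drake versions:

```python
import numpy as np
from pydrake.multibody.math import SpatialForce
from pydrake.multibody.plant import ExternallyAppliedSpatialForce

# One spatial force: 200 N tangentially, applied at the nut's body origin,
# expressed in the world frame.
force = ExternallyAppliedSpatialForce()
force.body_index = plant.GetBodyByName("nut").index()
force.p_BoBq_B = np.zeros(3)                  # application point: body origin
force.F_Bq_W = SpatialForce(tau=np.zeros(3),  # no pure torque
                            f=np.array([200.0, 0.0, 0.0]))

# Fix the plant's applied-spatial-force input port to this constant list.
plant.get_applied_spatial_force_input_port().FixValue(plant_context, [force])
```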
2. Applying a force tangentially in a horizontal plane using the iiwa and a simple position controller.
Experiment summary:
The force here is approximately the same as before: 70 N along tz, but higher (170 N) in tx and ty, though it is now applied only for a brief moment. The nut travels just a few degrees, or hundredths of a centimeter.
The video of this unsuccessful interaction is below; the contact forces are visualized using ContactVisualizer.
Please advise me on how to make this screw_joint more compliant.
I've tried varying the mass and inertia of the nut (by orders of magnitude) in these experiments; this seems to scale the contact forces, but does not affect the acceleration or velocity of the nut after contact.
I like your experiment using ExternallyAppliedSpatialForce to get an idea of scales, though TBH I didn't quite get the details of this setup.
The things that caught my eye are about scales, which you can estimate with pen and paper:
Your inertia is 1e-9 kg⋅m²?! Judging from your interaction with the iiwa, I estimated a radius of 1 cm, and with that you'd get 2e-6 kg⋅m², three orders of magnitude larger.
A force of 200 N on a 20 gram nut would cause an acceleration of 10,000 m/s². As a reference, that's about 1000 times the acceleration of Earth's gravity!
Are these numbers correct? Also, if you happen to have fast interactions (do you?), you might want to estimate a time step that makes sense for your application.
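For concreteness, the pen-and-paper check takes a few lines of Python (0.02 kg is from your .sdf; the 1 cm radius is my estimate):

```python
m, r = 0.02, 0.01             # nut mass (kg) and estimated radius (m)
I_est = m * r**2              # ~2e-6 kg*m^2, vs. the 1.1e-9 in the .sdf
a = 200.0 / m                 # 1e4 m/s^2 under a 200 N force
print(I_est, a, a / 9.81)     # the acceleration is ~1000 g
```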
Hopefully this helps!
The good thing is that I've fixed it; the bad thing is that I don't understand the fix.
Let me restate the problem I was tackling: a manipulated object reacted quite predictably to ExternallyAppliedSpatialForce, but couldn't be moved via contact with the manipulator.
What was done:
I've updated Drake from 2f340192a9dc79110410faf8a6d54a8615ddca92 (circa 22 Aug 2022) to 42448c0af1b39f0c46f760e7ae37d77097689ad3 (circa 3 Nov 2022).
After the update, my experimental setup broke down with the assertion Actuation input port for model instance ... must be connected (similar to the issue raised in this question). My fix was as follows:
bolt_n_nut_ = internal::AddAndWeldModelFrom(sdf_path, "nut_and_bolt",
                                            plant_->world_frame(),
                                            "bolt", X_WC, plant_);
then later in ManipulationStation::Finalize:
auto zero_torque = builder.template AddSystem<
    systems::ConstantVectorSource<double>>(Eigen::VectorXd::Zero(1));
builder.Connect(zero_torque->get_output_port(),
                plant_->get_actuation_input_port(bolt_n_nut_));
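For anyone working from Python, a rough pydrake equivalent of the same wiring, assuming `builder`, `plant`, and the `bolt_n_nut` model instance from my setup:

```python
import numpy as np
from pydrake.systems.primitives import ConstantVectorSource

# Feed a constant zero torque to the nut-and-bolt actuation input port,
# which satisfies the "must be connected" assertion.
zero_torque = builder.AddSystem(ConstantVectorSource(np.zeros(1)))
builder.Connect(zero_torque.get_output_port(),
                plant.get_actuation_input_port(bolt_n_nut))
```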
With the changes above, the manipuland began to interact with the manipulator:
Things to note in the graphs:
the distance the manipuland moved grew from fractions of a millimeter up to tens of centimeters. The video shows that the nut became manipulable.
this interaction violates the constraints of the ScrewJoint, i.e. the manipuland moves along its axis without the corresponding rotation.

HoloLens: How to stabilize holograms at far distances

I want to place virtual objects (holograms) at far distances (20+ meters) in the HoloLens 1. However, at such distances holograms become unstable and appear to "swim" in the display. Has anyone had success with this? What worked for you?
Some potential fixes include:
Ensure 60 FPS
Adjust Stabilization Plane
Employ visual markers (Vuforia)
Use static room scan (may not scale well)
For me, frame rate is not an issue, and I am using Unity 2017.4.4f1. Currently I have a single world anchor, and all objects are set relative to this anchor.
20+ meters is a lot, and I am not sure whether this will work well enough.
Ensuring 60 fps, or at least 50-55+, is important, but this won't solve the swimming at this distance. A low frame rate might only cause additional swimming :)
Everything that should appear statically placed in the room should be on or very close to the stabilization plane. What you want to avoid is having the far objects at very different distances from the user; that would cause the ones farthest from the stabilization plane to swim.
If you only have the far-away object, try placing the stabilization plane at the same distance as the object. If the distances change a lot, you can also update the stabilization plane distance at runtime, always setting it to the current distance to the object.
Would be interesting to hear if it worked out :)
One more thing: if I remember correctly, objects should ideally be placed directly at, or in close proximity to, their world anchor to help stabilization.
20 metres is too far. The docs:

> Best practices: When holograms cannot be placed at 2 m and conflicts between convergence and accommodation cannot be avoided, the optimal zone for hologram placement is between 1.25 m and 5 m. In every case, designers should structure content to encourage users to interact 1+ m away (e.g. adjust content size and default placement parameters).

Is there a term for an explorable VR scene?

Is there a term agreed on by the tech community for a photographic (well, as close as possible) scene that can be explored by walking around? Obviously within certain limits. Say, a museum could scan a sculpture with a laser and make it available in VR as a 3D mesh with properly mapped textures. Is there a name for such a thing? The so-called 360 VR photos definitely fall short of such detail.
I think the most common names are:
360 if it's just an image from one point containing all the angles, usually an equirectangular or cubemap texture/video. Some have stereoscopy, but it's very limited.
360 with depth if it's a 360 that carries depth information in addition to color. This allows stereoscopy and some movement, but because of shadowing and the difficulty of acquiring depth maps it's almost never used. In the future, AI-based filling of shadowed areas, and perhaps replacing the need for capturing depth, might make this a commonly used format.
photogrammetry if it's converted to a textured mesh, has proper depth, and can be viewed from all angles (for example The Vanishing of Ethan Carter; unfortunately the 3D models from that article seem to be missing; I've sent them an email, maybe they'll fix it).
lightfield if it's a volume containing lots of 360 images with some kind of interpolation between them. It has proper depth but can be viewed only from within the mapped volume (see Welcome To Lightfields).

Position of the sun in the sky of Mars or the Moon

I'm interested in designing a device that will track the position of the sun from the surface of the Moon or Mars.
Is there a way to compute this information within the PyEphem package, or a way to derive it?
Alas, no: PyEphem can't generate positions as observed from bodies other than the Earth. You might want to look at Skyfield, AstroPy, or NOVAS for that capability.
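If you go the Skyfield route, its planetary-constants machinery can place an observer directly on the lunar surface. A sketch adapted from Skyfield's documentation; the kernel filenames and the example selenographic coordinates are placeholders to swap for your own:

```python
from skyfield.api import load
from skyfield.planetarylib import PlanetaryConstants

ts = load.timescale()
eph = load('de421.bsp')
sun, moon = eph['sun'], eph['moon']

# Frame and orientation kernels for the Moon.
pc = PlanetaryConstants()
pc.read_text(load('moon_080317.tf'))
pc.read_text(load('pck00008.tpc'))
pc.read_binary(load('moon_pa_de421_1900-2050.bpc'))

frame = pc.build_frame_named('MOON_ME_DE421')
place = moon + pc.build_latlon_degrees(frame, 26.3, -46.8)  # example site

t = ts.utc(2024, 1, 1, 12)
alt, az, distance = place.at(t).observe(sun).apparent().altaz()
print(alt.degrees, az.degrees)
```

A Mars surface observer should work the same way once you have a rotation frame for Mars loaded.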

Modeling Matlab Script with Adobe After Effects

So I am looking for some general advice. For my final project in my Computational Physics class, I have to complete the following problem
(4.16 from Computational Physics, 2nd Edition, by Giordano and Nakanishi):
Carry out a true three-body simulation in which the motions of the Earth, Jupiter, and the Sun are all calculated. Since all three bodies are now in motion, it is useful to take the center of mass of the three-body system as the origin, rather than the position of the Sun. We also suggest that you give the Sun an initial velocity which makes the total momentum of the system exactly zero (so that the center of mass will remain fixed). Study the motion of the Earth with different initial conditions. Also, try increasing the mass of Jupiter to 10, 100, and 1000 times its true mass.
My question: is it possible to write the code for the problem above and then import that code (or its result) into Adobe After Effects to model the three-body simulation? My teacher has said that if I am able to do so, he would be inclined to give me extra credit, which I desperately need.
It is possible to do some scripting in After Effects, but the language is JavaScript (or rather, ExtendScript), not Matlab. I would do everything in JavaScript: first calculate your solution (3 streams of position data), then use those streams to keyframe the corresponding layers in After Effects. You will have to learn a bit of the After Effects object model, but you won't need much: the Layer object and how to keyframe its position property. Go to the Adobe After Effects scripting forum https://forums.adobe.com/community/aftereffects_general_discussion/ae_scripting/ for script examples.
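The numerical core is language-agnostic. As a sketch of the physics (shown here in Python with the Euler-Cromer method the textbook uses, in AU / years / solar masses; the initial conditions and step size are my choices, not the book's), this produces the 3 streams of position data you would then port or export to ExtendScript for keyframing:

```python
import numpy as np

G = 4 * np.pi**2                          # AU^3 / (Msun * yr^2)
m = np.array([1.0, 3.0e-6, 9.5e-4])       # Sun, Earth, Jupiter (solar masses)
r = np.array([[0.0, 0.0], [1.0, 0.0], [5.2, 0.0]])                      # AU
v = np.array([[0.0, 0.0], [0.0, 2*np.pi], [0.0, 2*np.pi/np.sqrt(5.2)]])  # AU/yr

# Take the center of mass as the origin and zero the total momentum,
# as the problem statement suggests.
r -= (m[:, None] * r).sum(axis=0) / m.sum()
v -= (m[:, None] * v).sum(axis=0) / m.sum()

dt, steps = 0.001, 20000                  # 20 simulated years
positions = np.empty((steps, 3, 2))
for s in range(steps):
    for i in range(3):                    # update all velocities first
        a = np.zeros(2)
        for j in range(3):
            if i != j:
                d = r[j] - r[i]
                a += G * m[j] * d / np.linalg.norm(d)**3
        v[i] += a * dt
    r += v * dt                           # then all positions (Euler-Cromer)
    positions[s] = r                      # 3 streams of (x, y) to keyframe
```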