Hello, I am trying to model a self-balancing Segway robot in Simscape Multibody, but I cannot see the effect of gravity on my model when I run it. I have checked the direction of gravity and the masses of my bodies, but it still does not work. The inputs of the system are the revolute joint torques, which will be connected to a controller.
The Simscape model
The robot configuration
I have multiple drones working in swarm formation; I built the quadcopter model and the swarm model.
So far the swarm moves in a leader-follower formation and tracks a predefined trajectory using a PID controller.
What I need help with is how to add GPS and IMU sensor models to my model.
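A common starting point, independent of the simulation tool, is to model each sensor as the ideal quantity taken from the plant model and corrupted by a constant bias plus zero-mean Gaussian noise. Below is a minimal Python sketch of that idea; all noise levels, bias values, and the example numbers at the end are illustrative assumptions, not values from any particular sensor datasheet. (If you are working in MATLAB/Simulink, the gpsSensor and imuSensor objects from the sensor-fusion tooling implement the same idea in more detail.)

```python
import numpy as np

rng = np.random.default_rng(0)

def gps_measurement(true_pos, sigma=1.5):
    """Ideal position plus zero-mean Gaussian noise (sigma in metres, assumed)."""
    return np.asarray(true_pos) + rng.normal(0.0, sigma, 3)

def imu_measurement(true_accel, true_gyro,
                    accel_bias=(0.05, -0.03, 0.02),    # m/s^2, illustrative constant bias
                    gyro_bias=(0.001, 0.002, -0.001),  # rad/s, illustrative constant bias
                    accel_sigma=0.02, gyro_sigma=0.001):
    """Ideal specific force and angular rate plus bias and white noise."""
    accel = np.asarray(true_accel) + np.asarray(accel_bias) + rng.normal(0.0, accel_sigma, 3)
    gyro  = np.asarray(true_gyro)  + np.asarray(gyro_bias)  + rng.normal(0.0, gyro_sigma, 3)
    return accel, gyro

# At every sample time, pass each drone's true states through the sensor models
# and feed the noisy outputs to the formation/PID controller instead of the true states.
pos_meas = gps_measurement([10.0, 2.0, -5.0])
acc_meas, gyr_meas = imu_measurement([0.0, 0.0, -9.81], [0.0, 0.0, 0.1])
```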
I'm trying to record and replay hand animations on the HoloLens 2. I managed to record the tracked joint Transforms and use the recordings to animate given hand rigs. Now I'm trying to also record the tracked hand mesh. I'm aware of OnHandMeshUpdated in the IMixedRealityHandMeshHandler interface. Also, the following post guided me in this direction (very helpful):
How to get hand mesh data from Hololens2 without turning on Hand Mesh Visualization option
My question is: is there a simple way to simulate hand mesh data in the Unity Editor? At the moment I don't have access to my team's HoloLens, so I'm trying to figure out how to develop this feature directly in Unity.
AFAIK the OnHandMeshUpdated event is only raised when there is actual mesh data on the HoloLens, not in the Editor, where there are only the simulated joints of the controller and no hand mesh.
Any suggestions are welcome!
To simulate hand mesh input, you can use the RiggedHandVisualizer to drive a SkinnedMesh built from hand joint data to visualize the hands, and it works with InputSimulation in the Unity Editor. You can find an example in the RiggedHandVisualizer scene under MRTK/Examples/Experimental/RiggedHandVisualizer/Scenes. For more detail, please see Rigged Hand Visualizer [Experimental].
I am trying to model the well-known ball and beam problem in Modelica. I am now struggling to model the ball rolling down an inclined plane as a function of the beam angle. I intended to use the MultiBody library. Does anybody have an idea how to handle that, or has anybody dealt with a similar problem?
For hints and reference you might take a look at the simple vehicle model in the MSL translational package:
Modelica.Mechanics.Translational.Components.Vehicle
Check out how the inclined plane is implemented with idealRollingWheel to calculate the gravity-based acceleration acting on the vehicle.
A slight modification of that should deliver what you need for your tilted plane.
MultiBody provides a Wheel as well, but since you are investigating a 2D scenario, keeping things simple might be helpful.
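Whichever component you end up with, the standard rolling-without-slipping result is a useful sanity check for the simulated acceleration of the ball on the tilted beam:

$$ a \;=\; \frac{g \sin\theta}{1 + \dfrac{J}{m R^{2}}}, \qquad \text{solid sphere: } J = \tfrac{2}{5} m R^{2} \;\Rightarrow\; a = \tfrac{5}{7}\, g \sin\theta $$

where $\theta$ is the beam angle, $m$ the ball mass, $R$ its radius, and $J$ its moment of inertia about its centre.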
I want to create an explosion particle system, but I'm not sure how to do it. I was thinking of creating a fire particle system with a sphere emitter shape and then increasing the sphere's radius, but I don't know how to animate its size. Can anyone tell me how to do that, or does anyone have a better idea?
Particle emitters set the initial particle directions and the rate at which the particles move. That's generally how a visual representation of an explosion is created.
So rather than increasing the size of the emitter source, it is the dissemination of the particles in an outward direction that creates the appearance of an explosion.
You're not limited to one batch of particles, nor one type of particles, nor just one emitter. The best explosions are a highly complex layering of different particle types with different textures, coming from different emitters at differing rates, with differing rates of decay, spin rates, colour changes and falloff in both transparency and movement speed.
Making a truly great looking explosion is a real art form and will often take a good designer days to do with a GUI and constant real time playback, especially when trying to minimise the use of textures, quads, blends, fillrate and physics.
Here's a video from Unreal Engine in which concepts and qualities similar to those available in SceneKit are used to teach the terminology. It's not a 1:1 parallel with the SceneKit particle engine, but it's probably the best combination of visuals and simple explanations to help you rapidly understand what is possible and how to do it with particles.
Caveat: Unreal Engine probably has the best real-time particle engine in the world at the moment, so it's a little more advanced than what's in SceneKit.
But...the principles are essentially the same:
https://www.youtube.com/watch?v=OXK2Xbd7D9w
I am working on optical flow based vehicle detection and tracking, purely in MATLAB.
In my case the camera is in motion and the object is also in motion.
Previously, a lot of work has been done for the case of a stationary camera and a moving object. There, optical flow vectors can easily be determined using the Lucas-Kanade or Horn-Schunck methods; simply taking two consecutive images gives results. I have run such tests and achieved this myself (a minimal sketch of that baseline is included at the end of this post).
There is also the Simulink example viptrafficof_win available.
I need to perform optical flow based detection and tracking with both the camera and the object in motion. What methodology should I pursue?
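For reference, the stationary-camera baseline mentioned above boils down to a two-frame computation like the one below. It is written as a Python/OpenCV sketch purely for brevity (the MATLAB Computer Vision Toolbox provides opticalFlowLK and opticalFlowHS objects for the same computation); the file names and the threshold are placeholders.

```python
import cv2
import numpy as np

# Two consecutive frames of the traffic sequence (placeholder file names)
prev = cv2.cvtColor(cv2.imread("frame_000.png"), cv2.COLOR_BGR2GRAY)
curr = cv2.cvtColor(cv2.imread("frame_001.png"), cv2.COLOR_BGR2GRAY)

# Dense optical flow between the two frames (Farneback, a dense method in the spirit of Horn-Schunck)
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# With a stationary camera, large flow magnitudes directly mark the moving vehicles
magnitude = np.linalg.norm(flow, axis=2)
moving_mask = magnitude > 1.0   # pixels per frame, tune per sequence
```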
If your camera is moving, you would have to separate the camera motion (ego motion) from the motion of the objects. There are different ways of doing that. Here is a recent paper describing an approach using the orientations of optical flow vectors.
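A simpler, common baseline (not the method of that paper) is to estimate a global ego-motion model from tracked features, compensate for it, and treat the residual flow as object motion. Below is a rough Python/OpenCV sketch of that idea; the frame paths, the choice of a partial-affine motion model, and the threshold are all assumptions, and the MATLAB Computer Vision Toolbox has equivalents for each step.

```python
import cv2
import numpy as np

prev = cv2.cvtColor(cv2.imread("frame_000.png"), cv2.COLOR_BGR2GRAY)
curr = cv2.cvtColor(cv2.imread("frame_001.png"), cv2.COLOR_BGR2GRAY)

# Sparse Lucas-Kanade flow on corner features (most of them lie on the static background)
p0 = cv2.goodFeaturesToTrack(prev, maxCorners=500, qualityLevel=0.01, minDistance=7)
p1, status, _ = cv2.calcOpticalFlowPyrLK(prev, curr, p0, None)
good0, good1 = p0[status.flatten() == 1], p1[status.flatten() == 1]

# Robustly fit a global (camera) motion model; RANSAC inliers ~ background, outliers ~ moving objects
M, inliers = cv2.estimateAffinePartial2D(good0, good1, method=cv2.RANSAC,
                                         ransacReprojThreshold=3.0)

# Warp the previous frame by the estimated camera motion, then recompute dense flow:
# what remains is (approximately) the motion of the objects only
stabilized = cv2.warpAffine(prev, M, (prev.shape[1], prev.shape[0]))
residual_flow = cv2.calcOpticalFlowFarneback(stabilized, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)
moving_mask = np.linalg.norm(residual_flow, axis=2) > 1.0   # threshold is illustrative
```

The RANSAC inlier mask is also useful on its own: feature tracks rejected as outliers to the global motion model are themselves good candidates for moving vehicles.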