I am currently using OpenFOAM to run simulations with nearly 180 time steps. The completed time steps range from -180 to -6.
I have to generate pictures using ParaView.
I have to generate a picture for every 10th time step, i.e. -180, -170, -160, and so on.
Is it possible to do this with ParaView?
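Yes, ParaView is scriptable in Python (pvpython, or the built-in Python shell). A minimal sketch, assuming the case is opened through a dummy "case.foam" file (a common convention for the OpenFOAM reader) and that the default render view is acceptable; adjust paths, camera, and output names to your case:

```python
# Hedged sketch for pvpython / ParaView's Python shell.
from paraview.simple import *

reader = OpenFOAMReader(FileName='case.foam')  # hypothetical file name
reader.UpdatePipelineInformation()             # populates reader.TimestepValues

Show(reader)
view = GetActiveViewOrCreate('RenderView')

for t in reader.TimestepValues[::10]:          # every 10th completed time step
    view.ViewTime = t                          # jump the view to that time
    Render()
    SaveScreenshot('frame_t%g.png' % t, view)
```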
In an application I'm working on, I receive (x, y) samples at a fixed rate (100 Hz), one at a time, and need to detect a 4-second sequence (400 samples) with a constant gradient (within a certain tolerance) and store this gradient for later use. Each sample pair is 8 bytes long, so I would need 3200 bytes of memory if I used the standard moving-window least-squares regression algorithm for the purpose.
My question is: is there a formula for a continuous (recursive) calculation of the gradient of the line of best fit, one incoming sample at a time, without the need to keep an array of the last 400 samples? Something in the vein of an exponential moving average, where at any point in time only the latest averaged value needs to be known in order to update it with the new incoming sample.
I would appreciate any pointers to existing solutions.
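An exact sliding-window fit cannot be made fully buffer-free, because removing the oldest sample's contribution from the running sums requires knowing its value. But an exponentially weighted recursive least-squares fit, in the vein of the exponential moving average mentioned above, needs only five running sums. A minimal Python sketch, where the forgetting factor lam is an assumed tuning parameter (lam = 1 - 1/400 weights roughly the last 400 samples):

```python
# Sketch of an exponentially weighted recursive slope estimate: O(1) state,
# no sample buffer. lam is an assumption you must tune to your tolerance.
class RecursiveSlope:
    def __init__(self, lam=1.0 - 1.0 / 400.0):
        self.lam = lam
        self.sw = self.sx = self.sy = self.sxx = self.sxy = 0.0

    def update(self, x, y):
        """Feed one (x, y) sample; return the current weighted LS slope."""
        lam = self.lam
        self.sw  = lam * self.sw  + 1.0
        self.sx  = lam * self.sx  + x
        self.sy  = lam * self.sy  + y
        self.sxx = lam * self.sxx + x * x
        self.sxy = lam * self.sxy + x * y
        denom = self.sw * self.sxx - self.sx * self.sx
        return (self.sw * self.sxy - self.sx * self.sy) / denom if denom else 0.0
```

With samples arriving at a fixed rate, x can simply be the sample index; over very long runs it is worth re-centering x periodically to avoid floating-point precision loss.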
I have ten measurements, each containing 4 signals, each 6-8 hours long at a sample rate of 10 Hz (200k-300k samples per measurement).
3 signals are the x-, y-, and z-axes of an accelerometer (measuring acceleration in m/s^2). This sensor is positioned on a sphere. The sphere is inflated and deflated. The air fluctuation in the sphere is the 4th signal.
The accelerometer values are thus a representation of the fluctuation of air in the sphere. If the sensor is positioned on top of the sphere, a good correlation is seen between the air fluctuation and the Z-axis of the accelerometer. The X- and Y-axes should in a perfect situation be 0, but they are not and contain noise.
I have tried many methods of combining the three signals into one and comparing that to the air-fluctuation signal. At some points this does give a good outcome, but when the accelerometer is not moving along a single axis, the signal-to-noise ratio seems to be too low (e.g. when the sensor is at a 45-degree angle).
I am wondering if machine learning in MATLAB can be used here to automatically generate a model that makes the best fit: combine the 3 axis signals into one that best represents the 4th signal. I assume this is what machine learning is about?
I can provide the signals in filtered form, the integrated signals, the angle of the sensor at a given time, the full signal or just a minute of data, etc.
I have, however, no idea how to start or which toolbox to use to tackle this problem. Also, which signals should I feed into the algorithm, and at what length (the full signal vs. a couple of seconds/minutes)?
Can you help me get started on the machine learning process of combining the 3 signals (and/or processed versions of them) into one signal that closely matches the fourth signal?
I'm not asking for full code, but for the steps to take to tackle this problem, e.g. "have a look at this toolbox, and at function x".
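The question asks about MATLAB, but as a language-agnostic sketch of the supervised-regression framing (in MATLAB, the Statistics and Machine Learning Toolbox, e.g. the Regression Learner app, plays the same role), here is the structure in Python/scikit-learn. The arrays are hypothetical placeholders for your measurements:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor

# X: (n_samples, 3) accelerometer axes; y: (n_samples,) air-fluctuation signal.
# Placeholder random data; substitute your own loaded measurements.
X = np.random.randn(10000, 3)
y = np.random.randn(10000)

# Keep the time order intact when splitting, so the test set is truly unseen.
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=False)

model = RandomForestRegressor(n_estimators=100)
model.fit(X_train, y_train)
print("R^2 on held-out data:", model.score(X_test, y_test))
```

The main design choice is the feature set: raw per-sample axis values as here, or windowed features (short-window RMS, estimated sensor angle, etc.). Evaluating on a held-out, later portion of the recording tells you whether the learned combination generalizes.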
I'm using the program Dymola in this case. As you can see in the figure, we have our temperature goal (refTemp) and we are comparing this temperature with a temperature in the system (KvvTemp). Our goal is to take the difference between these temperatures and then multiply the difference by a small number, so our value will be between 0 and 1 before entering the integrator. Now to my question: how is it possible for the integrator's output to be the temperature we want to send into the system (y1)? Is there an explanation of how the temperature that enters the system (y1) can be set through the integrator?
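For intuition (a toy Python simulation, not Modelica code, with an assumed first-order plant): the integrator's output can serve as the commanded temperature y1 because the loop only settles when the integrator's input, the scaled error refTemp - KvvTemp, is zero; at that point the integrator holds exactly the y1 that keeps the system at the reference.

```python
# Toy discrete-time illustration of pure integral control.
dt, k_i, ref = 0.1, 0.5, 60.0
y1 = 20.0          # integrator state = temperature commanded into the system
kvv = 20.0         # measured system temperature (assumed first-order plant)
for _ in range(2000):
    err = ref - kvv          # refTemp - KvvTemp
    y1 += k_i * err * dt     # integrator accumulates the scaled error
    kvv += (y1 - kvv) * dt   # simple first-order plant response
print(kvv)  # approaches ref: error -> 0, integrator output holds steady
```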
I am reading about applications of clustering in human motion analysis. I started out with random numbers and applied the k-means clustering algorithm, but I wanted to have graphs that circle the clusters as shown in the picture. Basically, the lines represent the motion trajectory. I would appreciate ideas on how to obtain the motion trajectory of a person. The application is patient monitoring, where the trajectory will be used in abnormal-behavior detection.
I will be using a Kinect and recording the motion trajectory based on skeleton tracking. I will be recording the 4 quaternion values of the Head, Shoulder and Torso joints, plus the RGBD (red, green, blue, depth) value that is combined into 1 value for each of these joints. So there are a total of 4*3 + 3 = 15 time series, i.e. 15 variables. How do I convert them to represent the trajectories shown below and then apply clustering to cluster the trajectories? The clusters will allow classification.
Can somebody please show how to obtain a diagram similar to the one attached? And how do I fuse and convert the 15 time series from each person into a single trajectory?
The picture illustrates the number of clusters that are generated for the time series. Thank you in advance.
K-means is a bad fit for trajectories.
It needs to be able to compute the mean (which is why it is called "k-means"). Having a stable, sensible mean is important. But how meaningful is the mean of a set of time series, even if you could define it (and even if the series weren't, e.g., of different lengths and different movement speeds)?
Try hierarchical clustering with multivariate dynamic time warping.
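A minimal sketch of that combination in Python/SciPy, with a hand-rolled multivariate DTW; each trajectory is assumed to be a (length, 15) array, and the data here are hypothetical placeholders. DTW absorbs the differing lengths and speeds, and average-linkage hierarchical clustering only needs the resulting pairwise distance matrix:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw(a, b):
    """Multivariate DTW distance; a, b are (len, dims) arrays of any lengths."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])  # Euclidean local distance
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Placeholder trajectories: one (length, 15) array per recording.
trajectories = [np.random.randn(np.random.randint(80, 120), 15) for _ in range(10)]

# Pairwise DTW distances (O(n*m) per pair; downsample long series first).
k = len(trajectories)
dist = np.zeros((k, k))
for i in range(k):
    for j in range(i + 1, k):
        dist[i, j] = dist[j, i] = dtw(trajectories[i], trajectories[j])

Z = linkage(squareform(dist), method='average')
labels = fcluster(Z, t=3, criterion='maxclust')  # e.g. cut into 3 clusters
print(labels)
```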
I have created a continuous-time transfer function (s-domain) with the "tf" command. Now I can use the command "bodeplot" to plot the magnitude and phase characteristics.
Now I would like to plot the group delay and phase delay as well (both as a function of frequency). I stumbled upon the commands "grpdelay" and "phasedelay", but they seem to work with discrete-time filters only. What are the continuous-time equivalents?
I have searched all over the internet with no luck so far :-(
I don't think there is a straightforward function to do this. You're probably better off extracting the phase at many frequency points and then taking the derivative of the phase with respect to frequency (multiplied by -1).
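For example, in Python/SciPy (the same idea carries over to MATLAB with freqresp, unwrap, and a numerical derivative); the transfer function here is an arbitrary example:

```python
import numpy as np
from scipy import signal

# Example continuous-time system: H(s) = 1 / (s^2 + 0.5 s + 1)
sys = signal.TransferFunction([1.0], [1.0, 0.5, 1.0])

w = np.logspace(-2, 2, 2000)            # dense frequency grid (rad/s)
_, H = signal.freqresp(sys, w)
phase = np.unwrap(np.angle(H))          # unwrapped phase in radians

group_delay = -np.gradient(phase, w)    # tau_g(w) = -d(phase)/dw
phase_delay = -phase / w                # tau_p(w) = -phase(w)/w
```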