OpenGL vs. wxMathPlot for real-time plotting in wxWidgets?

I am new to wxWidgets, and am looking to do a continuous real-time plot. I have done some research into the real-time (continuous) plotting tools available for wxWidgets.
I am most interested in using wxMathPlot for the plotting, with mpFXYVector for feeding in points.
(I know there's the Numerix library as well, but there doesn't seem to be much documentation for it.)
However, I would like to do about 100 updates a second, i.e. 100 new points coming in per second.
Is this feasible using wxMathPlot with mpFXYVector, or would this approach be too slow?
Or should OpenGL be considered?

Displaying a real-time graph of data updated at 100 Hz using wxMathPlot is feasible.
(Screenshot: an app using three wxMathPlot instances, all showing data being updated at 128 Hz.)
If you want the sort of real time graph where the old data vanish on the left as the new data appear on the right, but the amount of data shown is constant at, say, the last 10 seconds, then mpFXYVector is awkward to use. For a discussion of how to deal with this, see this answer.
However, for a first pass, this framework should get you started
// ... in MyFrame constructor
vector<double> xs, ys;
mpFXYVector * data_layer;
mpWindow * graph;
// initialize
...
// Set up 100 Hz timer
...
MyFrame::OnTimer( ... )
{
    // add new data point(s)
    ...
    // copy data into layer
    data_layer->SetData(xs, ys);
    // redraw
    graph->Fit();
    /* Note that at this point we have requested a redraw,
       but it hasn't taken place yet.
       We need to return so that the windowing system can
       update the screen, etc.
       We will be back in 1/100 sec to do it all again.
    */
}
So, now the question is, how fast can we make this scheme go? I have experimented with a timer event handler that does nothing more than keep track of how many times it is called.
void MyFrame::OnTimer2(wxTimerEvent& )
{
    static int ticks = 0;
    ticks++;
    static clock_t start;
    if( ticks == 1 ) {
        start = clock();
    }
    if( ticks == 1000 ) {
        double duration = (double)(clock() - start) / CLOCKS_PER_SEC;
        wxMessageBox(wxString::Format("1000 ticks in %f secs\n", duration));
        myTimer->Stop();
        return;
    }
}
The fastest I can make this go, on a powerful desktop, is 66 Hz.
Does this matter?
IMHO, it does not. There is no way a human eye can appreciate a graph being updated at 100Hz. The important thing is that the graph shows data acquired at 100Hz, without any losses or lags, but the graph display does not need to be updated so frequently.
So, in the app that produced the screenshot at the top of this answer and other similar apps I have developed, the graph is being updated at 10Hz, which the wxWidgets framework has no trouble maintaining, and the data acquisition is occurring at a much higher frequency in another thread or process. Ten times per second, the graph display code copies the data that has been acquired in the meantime and updates the display.
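To make the decimation idea concrete, here is a minimal Python sketch (not wxWidgets code; the rates, buffer, and function names are illustrative) of data arriving at 100 Hz while the display is only refreshed at 10 Hz, so no samples are lost:

```python
from collections import deque

ACQ_HZ = 100      # acquisition rate (samples per second)
DISPLAY_HZ = 10   # display refresh rate

def simulate(seconds=1):
    buffer = deque()   # filled by the acquisition side
    displayed = []     # everything the graph has actually shown
    refreshes = 0
    for tick in range(ACQ_HZ * seconds):
        buffer.append(tick)  # one new sample per 1/100 s
        if (tick + 1) % (ACQ_HZ // DISPLAY_HZ) == 0:
            # display update: copy everything acquired since the last refresh
            displayed.extend(buffer)
            buffer.clear()
            refreshes += 1
    return displayed, refreshes
```

Running this for one simulated second shows 10 display refreshes covering all 100 acquired samples, which is exactly the "no losses, fewer redraws" property described above.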

Related

Remove Spikes from Periodic Data with MATLAB

I have some data which is time-stamped by a NMEA GPS string that I decode in order to obtain the single data point Year, Month, Day, etcetera.
The problem is that on a few occasions the GPS (probably due to some signal loss) goes haywire and spits out very wrong values. This generates spikes in the time-stamp data, as you can see from the attached picture, which plots the vector of Days as output by the GPS.
As you can see, the GPS data are generally well behaved, and the days go between 1 and 30/31 each month before falling back to 1 at the next month. In certain moments though, the GPS spits out a random day.
I tried all the standard MATLAB functions for despiking (such as medfilt1 and findpeaks), but either they are not suited to the task, or I do not know how to set them up properly.
My other idea was to loop over differences between adjacent elements, but the vector is so big that the computer cannot really handle it.
Is there any vectorized way to go down such a road and detect those spikes?
Thanks so much!
You need to filter your data using a simple low-pass to get rid of the outliers:
windowSize = 5;
b = (1/windowSize)*ones(1,windowSize);
a = 1;
FILTERED_DATA = filter(b,a,YOUR_DATA);
Just play a bit with windowSize until you get the smoothness you want.
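For anyone prototyping the same idea outside MATLAB, here is a Python sketch of the equivalent causal moving average; like MATLAB's filter(b, 1, x) with b = ones(1,w)/w, it implicitly treats samples before the start as zeros:

```python
def moving_average(x, window_size=5):
    # Causal FIR moving average: y[n] = (1/w) * sum of the last w samples,
    # with samples before the start of the signal taken as zero
    # (matching MATLAB's filter(b, 1, x) semantics).
    w = window_size
    return [sum(x[max(0, n - w + 1):n + 1]) / w for n in range(len(x))]
```

Note the start-up transient: the first window_size - 1 outputs are attenuated because the missing history counts as zeros, just as in the MATLAB version.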

Matlab Synchronize acquisition with Color/Depth of Kinect

I'm having trouble to synchronize the color and depth image with Image acquisition ToolBox.
Currently, I'm just trying to log both streams into binary files without frame drop or losing the synchronization.
I don't try to render during my recording.
The code for the start button :
colorVid = videoinput('kinect',1); depthVid = videoinput('kinect',2);
colorVid.FramesPerTrigger = inf; depthVid.FramesPerTrigger = inf;
triggerconfig([colorVid depthVid],'manual');
iatconfigLogging(colorVid,'Video/Color.bin');
iatconfigLogging(depthVid,'Video/Depth.bin');
start([colorVid depthVid]);
pause(2); % this is to be sure both sensors have started before the trigger
trigger([colorVid depthVid]);
where iatconfigLogging() is from here
and the stop button just doing
stop([colorVid depthVid]);
Since the frame rate of the Kinect is 30 FPS and we can't change this, I'm using FrameGrabInterval to emulate lower rates.
But when I go over about 5 FPS, I can't log depth and color and keep the frames synchronized for more than 20-25 seconds. And except at 1 FPS, the sync is gone after 2-3 minutes, while I'm looking for at least a 10-15 minute acquisition.
I'm looking at something like flushdata(obj,'triggers'); right now, but I can't figure out how to keep the 30 FPS with the logging.
Thanks in advance to anyone who can suggest something.
As far as I know you cannot synchronize the streams by triggering, because they are not synchronized in hardware. I tried it, and the best I could come up with was timestamping each stream and throwing away frame pairs that were too far apart in time. I noticed the classic beat-frequency effect, whereby the streams drift in and out of sync at a rate inversely proportional to the difference between their periods. The obvious disadvantage of throwing away frames is that you get a non-continuous stream.
You can get timestamp information using
[data time] = getdata(vid,1);
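To make the "throw away mismatched pairs" idea concrete, here is a small Python sketch; the greedy nearest-neighbour matching and the skew threshold are my own illustrative choices, not Image Acquisition Toolbox API:

```python
def pair_frames(color_times, depth_times, max_skew=0.016):
    # Greedy matching: for each colour timestamp, advance to the nearest
    # depth timestamp; keep the pair only if they differ by < max_skew
    # seconds (roughly half a frame period at 30 FPS).
    pairs, j = [], 0
    for i, tc in enumerate(color_times):
        # advance j while the next depth frame is at least as close to tc
        while j + 1 < len(depth_times) and \
                abs(depth_times[j + 1] - tc) <= abs(depth_times[j] - tc):
            j += 1
        if abs(depth_times[j] - tc) <= max_skew:
            pairs.append((i, j))
    return pairs
```

Frames with no depth partner within the threshold are simply dropped, which reproduces the non-continuous stream caveat mentioned above.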

Getting displacement from accelerometer data with Core Motion

I am developing an augmented reality application that (at the moment) wants to display a simple cube on top of a surface, and be able to move in space (both rotating and displacing) to look at the cube in all the different angles. The problem of calibrating the camera doesn't apply here since I ask the user to place the iPhone on the surface he wants to place the cube on and then press a button to reset the attitude.
To find out the camera rotation is very simple with the Gyroscope and Core Motion. I do it this way:
if (referenceAttitude != nil) {
[attitude multiplyByInverseOfAttitude:referenceAttitude];
}
CMRotationMatrix mat = attitude.rotationMatrix;
GLfloat rotMat[] = {
mat.m11, mat.m21, mat.m31, 0,
mat.m12, mat.m22, mat.m32, 0,
mat.m13, mat.m23, mat.m33, 0,
0, 0, 0, 1
};
glMultMatrixf(rotMat);
This works really well.
More problems arise anyway when I try to find the displacement in space during an acceleration.
The Apple Teapot example with Core Motion just adds the x, y and z values of the acceleration vector to the position vector. This (apart from having not much sense) has the result of returning the object to the original position after an acceleration. (Since the acceleration goes from positive to negative or vice versa).
They did it like this:
translation.x += userAcceleration.x;
translation.y += userAcceleration.y;
translation.z += userAcceleration.z;
What should I do to find the displacement from the acceleration at some instant (with a known time difference)? Looking at some other answers, it seems I have to integrate twice to get velocity from acceleration and then position from velocity. But there is no example in code anywhere, and I don't think that is really necessary. Also, there is the problem that when the iPhone is lying still on a surface, the accelerometer values are not zero (there is some noise, I think). How much should I filter those values? Am I supposed to filter them at all?
Cool, there are people out there struggling with the same problem, so it is worth spending some time on :-)
I agree with westsider's statement as I spent a few weeks of experimenting with different approaches and ended up with poor results. I am sure that there won't be an acceptable solution for either larger distances or slow motions lasting for more than 1 or 2 seconds. If you can live with some restrictions like small distances (< 10 cm) and a given minimum velocity for your motions, then I believe there might be the chance to find a solution - no guarantee at all. If so, it will take you a pretty hard time of research and a lot of frustration, but if you get it, it will be very very cool :-) Maybe you find these hints useful:
First of all to make things easy just look at one axis e.g x but consider both left (-x) and right (+x) to have a representable situation.
Yes, you are right: you have to integrate twice to get the position as a function of time. For further processing you should store the first integration's result (== velocity), because you will need it at a later stage for optimisation. Do it very carefully, because every tiny bug will lead to huge errors after a short period of time.
Always bear in mind that even a very small error (e.g. <0.1%) will grow rapidly after integrating twice. The situation becomes even worse after one second if you configure the accelerometer at, say, 50 Hz: 50 ticks are processed and the tiny, seemingly negligible error will outrun the "true" value. I would strongly recommend not relying on the trapezoidal rule but using at least Simpson's rule or a higher-degree Newton-Cotes formula.
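For reference, here is a minimal Python sketch of the double integration (using the trapezoidal rule for brevity, even though a higher-order rule is recommended above), keeping the intermediate velocity series as suggested:

```python
def integrate_trapezoid(accel_samples, dt):
    # Trapezoidal integration: acceleration -> velocity -> position,
    # assuming the device starts at rest at the origin.
    velocity = [0.0]
    for a0, a1 in zip(accel_samples, accel_samples[1:]):
        velocity.append(velocity[-1] + 0.5 * (a0 + a1) * dt)
    position = [0.0]
    for v0, v1 in zip(velocity, velocity[1:]):
        position.append(position[-1] + 0.5 * (v0 + v1) * dt)
    return velocity, position
```

With a constant 1 m/s² acceleration sampled at 100 Hz for one second, this recovers v = 1 m/s and d = 0.5 m; with real, noisy sensor data the small per-sample errors accumulate exactly as described above.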
If you managed this, you will have to keep an eye on setting up the right low pass filtering. I cannot give a general value but as a rule of thumb experimenting with filtering factors between 0.2 and 0.8 will be a good starting point. The right value depends on the business case you need, for instance what kind of game, how fast to react on events, ...
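The low-pass filter itself can be as simple as a one-pole exponential smoother; a Python sketch with the filter factor alpha in the 0.2-0.8 range suggested above (the function name is illustrative):

```python
def low_pass(samples, alpha=0.5):
    # One-pole exponential smoothing: filtered += alpha * (x - filtered).
    # Smaller alpha = smoother output but slower reaction to real motion.
    out, y = [], samples[0]
    for x in samples:
        y += alpha * (x - y)
        out.append(y)
    return out
```

A single-sample spike of 10 decays to 5, 2.5, 1.25, ... with alpha = 0.5, which is the trade-off to tune against responsiveness.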
Now you will have a solution which works pretty well under certain circumstances and within a short period of time. But then, after a few seconds, you will run into trouble because your object drifts away. Now you enter the difficult part of the solution, which I eventually failed to handle within the given time scope :-(
One promising approach is to introduce something I call "synthetic forces" or "virtual forces". This is a strategy to react to the several bad situations that cause the object to drift away even though the device remains still in your hands. The most troubling one is a velocity greater than 0 without any acceleration. This is an unavoidable result of error propagation, and can be handled by slowing down artificially, i.e. introducing a virtual deceleration even if there is no real counterpart. A very simplified example:
if (vX > 0 && lastAccelerationXTimeStamp > 0.3sec) {
    vX *= 0.9;
}
You will need a combination of such conditions to tame the beast. A lot of trial and error is required to get a feeling for the right way to go, and this will be the hard part of the problem.
If you ever managed to crack the code, pleeeease let me know, I am very curious to see if it is possible in general or not :-)
Cheers Kay
When the iPhone 4 was very new, I spent many, many hours trying to get an accurate displacement using the accelerometers and gyroscope. There shouldn't have been much concern about incremental drift, as the device needed to move only a couple of meters at most and the data collection typically ran for a few minutes at most. We tried all sorts of approaches and even had help from several Apple engineers. Ultimately, it seemed that the gyroscope wasn't up to the task. It was good for 3D orientation, but that was it ... again, according to very knowledgeable engineers.
I would love to hear someone contradict this - because the app never really turned out as we had hoped, etc.
I am also trying to get displacement on the iPhone. Instead of using integration I used the basic physics formula of d = .5a * t^2 assuming an initial velocity of 0 (doesn't sound like you can assume initial velocity of 0). So far it seems to work quite well.
My problem is that I'm using deviceMotion and the values are not correct. deviceMotion.gravity reads near 0. Any ideas? - OK, fixed: apparently deviceMotion.gravity has x, y, and z values. If you don't specify which one you want, you get back x (which should be near 0).
Finding this question two years later, I just found an AR project in the iOS 6 docset named pARk. It provides an approximate displacement capture and calculation using the gyroscope, aka CoreMotion.framework.
I'm just starting to learn the code.
to be continued...

Best way to code a real-time multiplayer game

I'm not sure if the term real-time is being misused here, but the idea is that many players on a server have a city producing n resources per second. There might be a thousand such cities. What's the best way to reward all of the player cities?
Is the best way a loop like such placed in an infinite loop running whenever the game is "live"? (please ignore the obvious faults with such simplistic logic)
foreach(City c in AllCities){
    if(c.lastTouched < DateTime.Now.AddSeconds(-10)){
        c.resources += (DateTime.Now - c.lastTouched).Seconds * c.resourcesPerSecond;
        c.lastTouched = DateTime.Now;
        c.saveChanges();
    }
}
I don't think you want an infinite loop as that would waste a lot of CPU cycles. This is basically a common simulation situation Wikipedia Simulation Software and there are a few approaches I can think of:
A discrete time approach where you increment the clock by a fixed time and recalculate the state of your system. This is similar to your approach above except do the calculation periodically and remove the 10 seconds if clause.
A discrete event approach where you have a central event queue, each with a timestamp, sorted by time. You sleep until the next event is due and then dispatch it. E.g. the event could mean adding a single resource. Wikipedia Discrete Event Simulation
Whenever someone asks for the number of resources calculate it based on the rate, initial time, and current time. This can be very efficient when the number of queries is expected to be small relative to the number of cities and the elapsed time.
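The third approach (compute on demand) can be sketched like this in Python; the class and method names are illustrative, not from any framework:

```python
import time

class City:
    # Lazy accrual: nothing ticks in the background. Resources are computed
    # on demand from the rate and the time elapsed since the last settlement.
    def __init__(self, rate_per_second, now=None):
        self.rate = rate_per_second
        self.banked = 0.0
        self.last_settled = now if now is not None else time.time()

    def resources(self, now=None):
        now = now if now is not None else time.time()
        return self.banked + self.rate * (now - self.last_settled)

    def settle(self, now=None):
        # Bank the accrued resources, e.g. when the player spends some.
        now = now if now is not None else time.time()
        self.banked = self.resources(now)
        self.last_settled = now
```

No loop runs at all; a thousand idle cities cost nothing until someone actually queries or spends their resources.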
While you can store the last-ticked time per object, as in your example, it's often easier to just have a global timestep:
while(1) {
    currentTime = now();
    dt = currentTime - lastUpdateTime;
    foreach(whatever)
        whatever.update(dt);
    lastUpdateTime = currentTime;
}
if you have different systems that don't need as frequent updates:
while(1) {
    currentTime = now();
    dt = currentTime - lastUpdateTime;
    subsystem.timer += dt;
    // need to be careful that subsystem.update() runs faster than
    // subsystem.updatePeriod, or this inner loop can never catch up
    while (subsystem.timer > subsystem.updatePeriod) {
        subsystem.timer -= subsystem.updatePeriod;
        subsystem.update(subsystem.updatePeriod);
    }
    // ...
    lastUpdateTime = currentTime;
}
(which you'll notice is pretty much what you were doing on a per city basis)
Another gotcha is that with different subsystem clock rates you can get overlaps (i.e. ticking many subsystems in the same frame), leading to inconsistent frame times, which can sometimes be an issue.

iPhone BPM tempo button

I want to create a button that allows the user to tap on it and thereby set a beats-per-minute value. I will also have touches moved up and down on it to adjust faster and slower (I have already worked out this bit).
What are some appropriate ways to record the times at which the user has tapped the button, so I can take an average time between presses and thereby work out a tempo?
Overall
You're best off using a C-level timer from time.h instead of an NSDate; at the rate of beats, the overhead of creating NSDate objects could cost you precision.
Note, though, that time_t typically has only whole-second resolution, so time()/difftime() alone are too coarse for tap timing; a sub-second clock such as gettimeofday() (or CACurrentMediaTime()) is a safer bet.
Use the whole screen for this; don't just give the user one small button.
Two ideas
Post-process
Store all times in an array.
Trim the result. Remove elements from the start and end that are more than a threshold from the average.
Get the average from the remaining values. That's your speed.
If it's close to a common value, use that.
Adaptive
Use 2 variables. One is called speed and the other error.
After the first 2 beats calculate the estimated speed, set error to speed.
After each beat
queue = Fifo(5)          # first-in, first-out queue; try out
                         # different values for the length
currentBeat = now - timeOfLastBeat
currentError = |speed - currentBeat|
# adapt: you have to experiment with how much weight
# currentError should have
error = (error + currentError) / 2
queue.push(currentBeat)  # push newest interval on the queue;
                         # the oldest is removed automatically
speed = average(queue)
As soon as error gets smaller than a certain threshold you can stop and tell the user you've determined the speed.
Go crazy with the interface. Make the screen flash whenever the user taps. Extra sparks for a tap that is nearly identical to the expected time.
Make the background color correspond to the error. Make it brighter the smaller the error gets.
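Here is a Python sketch of the adaptive scheme above; the window length, the error update, and the convergence measure follow the pseudocode and are illustrative choices:

```python
from collections import deque

def bpm_estimator(tap_times, window=5):
    # Keep the last few inter-tap intervals in a FIFO and average them;
    # `error` shrinks as the taps become consistent, signalling convergence.
    intervals = deque(maxlen=window)
    speed, error = None, None
    for prev, cur in zip(tap_times, tap_times[1:]):
        beat = cur - prev
        if speed is None:
            speed, error = beat, beat       # first interval seeds both
        else:
            error = (error + abs(speed - beat)) / 2
        intervals.append(beat)              # oldest interval drops out
        speed = sum(intervals) / len(intervals)
    bpm = 60.0 / speed if speed else 0.0
    return bpm, error
```

Perfectly regular taps every 0.5 s converge to 120 BPM with the error halving on every beat, which is the threshold you would watch to tell the user the tempo is locked in.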
Each time the button is pressed, store the current date/time (with [NSDate date]). Then, the next time it's pressed, you can calculate the difference with -[previousDate timeIntervalSinceNow] (negative because it's subtracting the current date from the previous), which will give you the number of seconds.