I realize there are many questions here about converting MIDI ticks to milliseconds (e.g. How to convert midi timeline into the actual timeline that should be played, Midi Ticks to Actual PlayBack Seconds !!! (Midi Music), Midi timestamp in seconds). I have looked at them all and tried to implement the suggestions, but I am still not getting it.
(Did I mention I am a little "math-phobic"?)
Can anyone help me work through a practical example? I am using the BASS library from un4seen. I have all the data I need; I just don't trust my calculations.
Bass Methods
Tick
// current position of the MIDI stream, in ticks
uint64_t tick = BASS_ChannelGetPosition(midiFileStream, BASS_POS_MIDI_TICK);
PPQN
//The Pulses Per Quarter Note (or ticks per beat) value of a MIDI stream.
float ppqn;
BASS_ChannelGetAttribute(midiFileStream, BASS_ATTRIB_MIDI_PPQN, &ppqn);
Tempo
//tempo in microseconds per quarter note.
uint32_t tempo = BASS_MIDI_StreamGetEvent( midiFileStream, -1, MIDI_EVENT_TEMPO);
My Attempt at Calculating MS value for tick:
float currentMilliseconds = tick * tempo / (ppqn * 1000);
The value I get appears correct, but I don't have any confidence in it since I don't quite understand the formula.
printf("tick %llu\n",tick);
printf("ppqn %f\n",ppqn);
printf("tempo %u\n",tempo);
printf("currentMilliseconds %f \n", currentMilliseconds);
Example output:
tick 479
ppqn 24.000000
tempo 599999
currentMilliseconds 11974.980469
Update
My confusion continues, but based on this blog post I think I have the code right; at least the output seems accurate. Conversely, the answer provided by @Strikeskids below yields different results. Maybe I have an order-of-operations problem in there?
float kMillisecondsPerQuarterNote = tempo / 1000.0f;
float kMillisecondsPerTick = kMillisecondsPerQuarterNote / ppqn;
float deltaTimeInMilliseconds = tick * kMillisecondsPerTick;
printf("deltaTimeInMilliseconds %f \n", deltaTimeInMilliseconds);
And the formula from the answer below:
float currentMillis = tick * 60000.0f / ppqn / tempo;
printf("currentMillis %f \n", currentMillis);
Output:
deltaTimeInMilliseconds 11049.982422
currentMillis 1.841670
Tempo is in beats per minute. Because you want to end up with a time, it should be in the denominator of your fraction.
currentTime = currentTick * (beats / tick) * (minutes / beat) * (millis / minute)
millis = tick * (1/ppqn) * (1/tempo) * (1000*60)
To use integer arithmetic efficiently, do:
currentMillis = tick * 60000 / ppqn / tempo
This works:
float kMillisecondsPerQuarterNote = tempo / 1000.0f;
float kMillisecondsPerTick = kMillisecondsPerQuarterNote / ppqn;
float deltaTimeInMilliseconds = tick * kMillisecondsPerTick;
printf("deltaTimeInMilliseconds %f \n", deltaTimeInMilliseconds);
Related
In my app I need to submit the time to Game Center and show it in the "Elapsed Time - To the Hundredth of a Second" format:
00:00:00.00
This is the format I want to show on the leaderboard.
In my app I'm getting the time in the following format:
ss.SS
where ss = seconds and SS = hundredths of a second.
I convert the value to a double before sending it to Game Center:
double newScoreDouble = [newScore doubleValue];
But when I send the double score to Game Center, it asks me to convert it to int64_t, and when I convert it to that format it loses part of the double value.
double intPart = 0;
double fractPart = modf(newScoreDouble, &intPart);
int isecs = (int)intPart;
int min = isecs / 60;
int sec = isecs % 60;
int hund = (int) (fractPart * 100);
int64_t time_to_send_through_game_center = min*6000 + (sec*100 + hund);
This is the way I convert the double to int64_t.
Can anyone say how to send the whole double value to Game Center and display it in the "Elapsed Time - To the Hundredth of a Second" format?
Thanks
I've done this before. When you're recording a score in the "to the hundredth of a second" format, you multiply your seconds by a hundred before submitting.
So let's say the user scored 1 minute, 44 seconds, 300 milliseconds: 1:44.30 = 104.3 seconds. Then you would set the value property of your GKScore object to 104.3 * 100 = 10430 and submit it like that.
Give it a try :)
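For illustration, here is a plain C sketch of that round trip (it leaves out the actual GKScore submission): multiply the seconds by 100 to get an int64_t to submit, then rebuild the mm:ss.SS display from that integer.
#include <math.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    double elapsedSeconds = 104.30;  // e.g. 1 minute 44.30 seconds
    int64_t hundredths = (int64_t)llround(elapsedSeconds * 100.0);  // value to submit: 10430

    // Rebuild the elapsed-time display from the submitted integer.
    int64_t min  = hundredths / 6000;
    int64_t sec  = (hundredths / 100) % 60;
    int64_t hund = hundredths % 100;
    printf("%02lld:%02lld.%02lld\n", (long long)min, (long long)sec, (long long)hund);  // 01:44.30
    return 0;
}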
I’m using this equation to convert steps to estimated calories burned.
I now need to do the opposite and convert total calories to estimated steps.
This is the equation I’m using for steps to calories:
+(CGFloat) totalCalories:(NSUInteger)TotalStepsTaken weight:(CGFloat)PersonsWeight{
CGFloat TotalMinutesElapsed = (float)TotalStepsTaken / AverageStepsPerMinute; //Average Steps Per Minute is Equal to 100
CGFloat EstimatedCaloriesBurned = PersonsWeight * TotalMinutesElapsed * Walking3MphRate; //Walking3MphRate = 0.040;
return EstimatedCaloriesBurned;
}
It’s written in Objective-C, but I tried to make it as readable as possible.
All calculations are being done over a 1 hour period.
Thank you for your help.
Here's the algebra, I don't know Obj-C syntax...
Steps = round( Calories * StepsPerMinute / (Weight * WalkRate) );
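A rough C translation of that algebra, reusing the constants from the question (AverageStepsPerMinute = 100, Walking3MphRate = 0.040); the function name here is just a placeholder:
#include <math.h>

#define AverageStepsPerMinute 100.0f
#define Walking3MphRate       0.040f

// Inverse of the calories calculation: estimate steps from calories burned and weight.
static unsigned int EstimatedStepsForCalories(float calories, float personsWeight)
{
    float minutes = calories / (personsWeight * Walking3MphRate);
    return (unsigned int)roundf(minutes * AverageStepsPerMinute);
}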
I have the following code and wanted another set of eyes to make sure I have written the right code to calculate the frame rate of a scene. Could you please chime in?
This is written for the iPad using SDK 3.2.
thanks!
- (void)drawView:(id)sender
{
mach_timebase_info_data_t timer;
mach_timebase_info(&timer);
uint64_t t1 = mach_absolute_time();
[renderer render];
uint64_t delta = mach_absolute_time() - t1;
delta *= timer.numer;
delta /= timer.denom;
NSLog(#"%lld ms: %.2f FPS", delta, 1000000000.0f/delta);
}
In case you want to measure the time spent rendering OpenGL, this won't work. OpenGL operations are processed in parallel and will not affect timing on the CPU. You can profile the time it takes to issue the OpenGL calls, but you won't be able to see how long it took them to finish.
This is unfortunate, but it makes sense. This is probably the reason why everyone is just eyeing their framerate: if the GPU can't finish processing in time, your CPU gets blocked and your timer (most likely CADisplayLink) will not fire "in time".
You may want to look into (expensive) profiling tools like gDEBugger, but I'm not sure they work on iOS.
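For a rough upper bound you can force the CPU to wait for the GPU by calling glFinish() before taking the end timestamp; note that this stalls the pipeline, so it changes the very thing you are measuring. A sketch based on the code from the question:
- (void)drawView:(id)sender
{
    mach_timebase_info_data_t timer;
    mach_timebase_info(&timer);
    uint64_t t1 = mach_absolute_time();
    [renderer render];
    glFinish(); // block until the GPU has drained the queued commands (expensive!)
    uint64_t delta = mach_absolute_time() - t1;
    delta *= timer.numer;
    delta /= timer.denom; // delta is now in nanoseconds
    NSLog(@"%lld ns: %.2f FPS", delta, 1000000000.0f / delta);
}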
I use CFAbsoluteTime to compute the frame duration in an OpenGL app. I stopped using mach_time because the results were not reliable.
- (void)update {
// Compute Frame Duration
static CFAbsoluteTime sPreviousTime = 0;
const CFAbsoluteTime newTime = CFAbsoluteTimeGetCurrent();
const CFAbsoluteTime deltaTime = newTime - sPreviousTime;
sPreviousTime = newTime;
float frameDuration = deltaTime;
// keep frameDuration in [0.01 ; 0.5] seconds
if (frameDuration > 0.5f) {
frameDuration = 0.5f;
} else if (frameDuration < 0.01f) {
frameDuration = 0.01f;
}
[self tick:frameDuration]; // use frameDuration to do something every frame
}
Short answer: yes, what you are doing is correct.
Longer answer: to get the time in seconds for a delta between two mach_absolute_time calls, you need to do the following:
// I do this once at launch.
mach_timebase_info_data_t timer;
mach_timebase_info( &timer );
// Start time.
uint64_t t1 = mach_absolute_time( );
// Do activity.
// End time.
uint64_t t2 = mach_absolute_time( );
// Calculate delta.
uint64_t delta = t2 - t1;
// Use denom/numer from timer.
delta *= timer.numer;
delta /= timer.denom;
// Convert nanoseconds to seconds.
float secondsElapsed = ( float )( delta / 1000000000.0 );
Of course, if you want an FPS value, you need the inverse of the seconds:
1.0f / secondsElapsed;
In your case, instead of doing:
float secondsElapsed = ( float )( delta / 1000000000.0 );
You are doing:
float inverseSecondsElapsed = ( float )( 1000000000.0 / delta );
So you do indeed get the FPS, and all should work as intended.
I'm polling the accelerometer at 50 Hz (50 times per second) for data. When I suddenly flip the device 90 degrees on the x-axis, after it was lying flat on a table with the display facing up, the values move pretty slowly towards the "target" value for that position.
Now the weird thing is: if I increase the measurement rate, the value moves faster towards that new value when I suddenly flip the device by 90 degrees. But if I only ask once per second for the new value, it takes very long until the value reaches the target. What can be the reason for this?
I don't do any kind of data aggregation, and don't accumulate anything. I just do some simple filtering to get rid of the noise. My method looks like this:
- (void)accelerometer:(UIAccelerometer *)accelerometer didAccelerate:(UIAcceleration *)acceleration {
// Use a basic low-pass filter to only keep the gravity in the accelerometer values for the X and Y axes
// accelerationX is an instance variable
accelerationX = acceleration.x * 0.05 + accelerationX * (1.0 - 0.05);
// truncate to two decimal places
int i = accelerationX * 100;
float clippedAccelerationValue = i;
clippedAccelerationValue /= 100;
[self moveViews:clippedAccelerationValue];
}
later on, in my -moveViews: method, I do this:
-(IBAction)moveSceneForPseudo3D:(float)accelerationValue {
if(fabs(lastAccelerationValue - accelerationValue) > 0.02) { // a little threshold to prevent flickering when it lies on a table
float viewAccelerationOffset = accelerationValue * 19 * -1;
newXPos = initialViewOrigin + viewAccelerationOffset;
myView.frame = CGRectMake(newXPos, myView.frame.origin.y, myView.frame.size.width, myView.frame.size.height);
lastAccelerationValue = accelerationValue;
}
}
As a result, if the device gets turned 90 degrees (or 180 degrees) on the x-axis, the view just moves pretty slowly to its target position. I don't know whether that's because of the physics of the accelerometer, or a bug in my filtering code. I only know that there are fast-paced games that use the accelerometer for steering, so I can hardly imagine it's a hardware problem.
This line:
accelerationX = acceleration.x * 0.05 + accelerationX * (1.0 - 0.05);
is a low-pass filter, which works by computing a moving average of the x acceleration. In other words, each time that callback is called, you're only moving the accelerationX by 5% towards the new accelerometer value. That's why it takes many iterations before accelerationX reflects the new orientation.
What you should do is increase the 0.05 value, to say 0.2. I'd make a global #define and play around with different values along with different refresh rates.
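For example, a small sketch of that idea; the helper name and the 0.2 value are just placeholders to experiment with:
#define kFilteringFactor 0.2f   // higher = reacts faster, keeps more noise

// One step of the exponential low-pass filter: returns the new smoothed value.
static float LowPass(float previous, float raw)
{
    return raw * kFilteringFactor + previous * (1.0f - kFilteringFactor);
}

// In the accelerometer callback:
// accelerationX = LowPass(accelerationX, acceleration.x);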
I am trying to display minutes and seconds based on a number of seconds.
I have:
float seconds = 200;
float mins = seconds / 60.0;
float sec = mins % 60.0;
[timeIndexLabel setText:[NSString stringWithFormat:@"%.2f , %.2f", mins, seconds]];
But I get an error: invalid operands of types 'float' and 'double' to binary 'operator%'
And I don't understand why... Can someone throw me a bone!?
A lot of languages only define the % operator to work on integer operands. Try casting seconds and mins to int before you use % (or just declare them int in the first place). The constant values you use will also need to be int (use 60 instead of 60.0).
As others have pointed out, you should be using integers. However, no one seems to have spotted that the result will be incorrect. Go back and have another look at modulo arithmetic, and you'll realize you should be doing:
int seconds = 200;
int mins = seconds / 60;
int sec = seconds % 60;
Note the last line: seconds % 60 rather than mins % 60 (which would give the remainder of the minutes divided by 60, i.e. the minutes past the hour, and is completely unrelated to this calculation).
EDIT
doh, forgot the ints... :)
The 60.0 forces a conversion to double
try:
float seconds = 200;
float mins = seconds / 60;
float sec = (int)mins % 60;
Use ints instead. At least in your example, they seem to be enough (it will also be faster and clearer).
Also, in this case you would get 3.3333... mins rather than the 3 minutes you expect, so if you stay with floats you need to drop the fractional part of the minutes (e.g. with floorf()) before using them.
Do it like this:
float seconds = 200.5;
float mins = floor(seconds / 60.0);
float sec = seconds - mins * 60.0;