I have, most of the time, 50+ FPS, but when I load resources (on a background thread) it drops to 30 FPS. I want a constant FPS; 30 or even 20 is not a problem for me. What is the best way to keep the FPS constant?
You should be using time-based redrawing instead of frame-based drawing. Michael Daley's book has some excellent info on this.
Also, try loading as many of your resources as possible in a spritesheet.
Not really sure if there is an established method, but frame rate in this case is the frequency at which you redraw. 20 FPS is a 50 millisecond redraw rate. One method would be to only redraw if the time between the last redraw and now is >=50 ms.
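If it helps, here is a minimal sketch of that idea in a CADisplayLink callback (the property and method names are just placeholders, not from the original post):
- (void)displayLinkFired:(CADisplayLink *)link {
    CFTimeInterval now = CACurrentMediaTime();
    // Redraw only if at least 50 ms (20 FPS) have passed since the last redraw.
    if (now - self.lastRedrawTime >= 0.05) {
        self.lastRedrawTime = now;
        [self redrawScene]; // your actual drawing code
    }
}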
To my understanding of this diagram, Update() is called strictly once per game cycle, while FixedUpdate() is called zero to multiple times, whenever Unity decides it needs to run physics. Is that true?
What I'm not sure about is whether physics steps commonly run behind the current time or ahead of it. In other words, how does Unity decide whether to do a physics step: like while (currentTime - previousPhysicsUpdateTime >= fixedDeltaTime), or like while (currentTime > previousPhysicsUpdateTime)?
Or neither of the above?
It is a bit like the opposite. Physics is done regularly on a fixed time basis, which means FixedUpdate is called on a fixed time basis. It is Update that varies with the game's frame rate (at 60 fps, Update is called 60 times per second).
Now, we usually think from the Update point of view. That means that if you have a high fps, you may get two Update calls before a FixedUpdate happens. In the same way, if your fps drops you get fewer Updates but still the same number of FixedUpdate calls, so you can get several FixedUpdates between two Updates.
So for example, let's say FixedUpdate happens 50 times per second (that's the default in Unity). If your game runs at 60 fps, most of the time you will have one FixedUpdate and one Update call, and occasionally two Updates in a row (because Update is a bit more frequent). If you reach 100 fps, you will usually see two Update calls between FixedUpdates.
I suggest you look at the FixedUpdate documentation if this is not clear enough.
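Unity's exact loop isn't shown here, but conceptually a fixed timestep behaves like the classic accumulator pattern; here is a rough C-style sketch (not Unity's actual source; all names are made up):
double accumulator = 0.0;
while (gameIsRunning) {
    double frameTime = TimeSinceLastFrame(); // varies with the game's fps
    accumulator += frameTime;
    while (accumulator >= fixedDeltaTime) {  // catch physics up to "now"
        FixedUpdate();                       // 0..N calls per rendered frame
        accumulator -= fixedDeltaTime;
    }
    Update();                                // exactly one call per rendered frame
    Render();
}
In this sketch physics always runs slightly behind the current time and then catches up, which matches the first of the two conditions in the question.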
I have implemented a dynamical system in NetLogo using rk4, which makes updating extremely slow. I can't observe anything when I watch the model. Is there an efficient way to record the simulation?
I don't know much about graphics. What does frame rate mean, and would increasing it help?
From what I understand, NetLogo records each update of the view and plays it back at a specified frame rate. But in my case I want the frame rate to be extremely high, around 1k-10k frames per second. So what I am trying to do is, depending on the frame rate, make NetLogo record fewer snapshots of the view.
I don't know if I am conceptually wrong somewhere.
So essentially I want a correlation between the frame rate and the snapshots recorded from the view, so that frames which would not have much effect on a video playing back at such a high rate can be discarded.
The human eye can only perceive in the neighborhood of 50-100 frames per second, so when you say you're interested in getting "1k-10k" frames per sec, I don't understand that part. If you mean you want "1k-10k" ticks per second, that would make more sense.
A "frame" is just one of the still images that make up a movie.
If you record a movie using movie-grab-view or export-view, you're free to call those primitives as often or as seldom as you like, according to any scheme you like. For example, instead of grabbing a frame every tick, you might only grab a frame every 10th, or every 100th tick. The resulting movie will go by 10 or 100 times as fast, since it will contain 10x or 100x fewer frames. Using this technique, you can get as high a ticks-per-second number as you want.
Example code:
repeat 1000 [
  repeat 10 [ go ]   ;; advance the model 10 ticks without recording
  movie-grab-view    ;; grab one frame every 10th tick
]
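With this sketch, the movie contains 1,000 frames covering 10,000 ticks, so played back at, say, 25 frames per second it shows 250 simulated ticks per second of video.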
I realized that these are two different things:
Drawing, and getting it on screen.
So while you may draw in every single call from CADisplayLink at a rate of 60 FPS, if your drawing operations take slightly longer than 1/60 of a second you end up with 30 FPS in theory, because you miss every other chance to get through the render pipeline.
OK; knowing this, it seems pointless to remember the starting NSTimeInterval, increment a frame counter in the run loop, then check at the end whether a second has passed and calculate the FPS for that second.
I want a way to actually get the true FPS value from OpenGL ES on screen. I looked into Instruments in Xcode 3.2.6 but couldn't find one for this. But I remember there was a way to get that FPS value. The real one.
How?
Measuring OpenGL ES performance by framerate may not be the best approach. I've taken to recording frame time myself, which seems to provide a more accurate assessment of my overall rendering performance. It's trivial to encapsulate your rendering in something like
CFTimeInterval previousTimestamp = CFAbsoluteTimeGetCurrent();
// Do your OpenGL ES frame rendering here, as well as presenting the onscreen render buffer
CFTimeInterval frameDuration = CFAbsoluteTimeGetCurrent() - previousTimestamp;
NSLog(@"Frame duration: %f ms", frameDuration * 1000.0);
to obtain the rendering time. If you want it, your instantaneous framerate is the inverse of frameDuration (1.0 / frameDuration) in the above code.
Be careful to time the entire frame rendering, because the tile-based deferred renderer in iOS and other mobile devices may hide the true cost of particular rendering operations by delaying them until just before the frame is drawn to the screen.
However, if you want to obtain a less precise framerate from Instruments, you can do that using the OpenGL ES Driver instrument, which reports Core Animation Frames Per Second as one of its logged statistics.
Hi all, can anyone help me out with a solution to this problem?
I have a project where an NSTimer fires about 20 times a second. Using only one image (loaded programmatically), it draws that same image on the iPhone screen about twenty times a second, and these images fall from the top of the screen to the bottom, where they are removed (more or less simulating rainfall or rain droplets).
My problem is that, watching the animation, I noticed very small delays that look like breaks, pauses or small jitters, so the flow isn't smooth.
Can anyone help me with the solution please.
Thanks in advance
You can check the CADisplayLink class.
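For reference, a minimal sketch of setting one up (the selector name is just an example):
CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self selector:@selector(stepAnimation:)];
[link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSDefaultRunLoopMode];
// stepAnimation: will then be called once per screen refresh, in sync with the display.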
Just because you ask for a timer in the UI run loop to go off at a certain rate, doesn't mean you will get called at exactly that rate or at evenly spaced intervals. You should check the time and the elapsed time "dt" inside each timer callback, and change your animation position, y + dy * dt, etc., accordingly.
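A rough sketch of that idea inside the timer callback (the droplet view, speed, and timestamp properties are placeholders, not from the question):
- (void)timerFired:(NSTimer *)timer {
    CFTimeInterval now = CACurrentMediaTime();
    CFTimeInterval dt = now - self.lastUpdateTime; // actual elapsed time, not the nominal 1/20 s
    self.lastUpdateTime = now;
    CGPoint center = self.dropletView.center;
    center.y += self.fallSpeed * dt;               // move by speed * elapsed time
    self.dropletView.center = center;
}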
Timers aren't designed to be used for animation.
The best thing to do is to have a thread running in an infinite loop where you check whether it's time to animate again, or just animate continuously (giving you a higher frame rate), using the elapsed time as the reference for the state you are drawing.
You really shouldn't be using a timer for this. Instead you should be using the built-in UIView animation methods:
animateWithDuration:animations:
animateWithDuration:animations:completion:
animateWithDuration:delay:options:animations:completion:
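For example, a falling droplet could be animated with a single call; a minimal sketch, assuming a dropletView and a two-second fall (both made up for illustration):
[UIView animateWithDuration:2.0
                 animations:^{
                     // Move the droplet to the bottom of the screen.
                     dropletView.center = CGPointMake(dropletView.center.x, self.view.bounds.size.height);
                 }
                 completion:^(BOOL finished) {
                     [dropletView removeFromSuperview]; // remove it once it reaches the bottom
                 }];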
Take CABasicAnimation, for example. How do you lower the frame rate (overhead)? Animations run smoothly, but my touchesMoved method skips a beat. I want to reduce the animation frame rate so touchesMoved doesn't skip movements.
You don't have any inherent control over frame rate once you start your CABasicAnimation.
Probably the best way to achieve this would be to create multiple interpolations for a single animation (i.e. if you're moving 50 px down and 50 px across, do 2 x 25 px each) and induce an artificial sleep in your thread. Not a perfect solution, but it will perhaps achieve slightly better results than you're seeing.
Be aware that this technique will have different framerates on different CPUs, and is therefore not generally recommended. Essentially, YMMV.