I read all the posts I found about improving performance, but my problem is a bit different.
I have a simple physics-based game, nothing super fancy.
I've got at most 40 nodes on screen and a few SKLabelNodes.
My code is efficient: every node that moves off screen is removed from its parent, and I've got just 4 physics bodies at once. Textures are not too big (max 250x250) and are preloaded in an atlas.
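(For reference, a minimal Swift sketch of the off-screen cleanup described above; the method name, scroll direction, and bounds are my assumptions, not from the question:)

    // Hypothetical sketch: remove nodes once they have fully scrolled off the
    // left edge. Could be called from didSimulatePhysics, for example, since
    // the question's own update method only handles the backgrounds.
    func removeOffscreenNodes() {
        for node in children where node.position.x < -node.frame.width {
            node.removeFromParent()
        }
    }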
The problem is that the game runs fine for a minute or so, then stutters for 4-5 seconds, and then runs fine again. The stuttering doesn't happen at any particular point: sometimes it's at startup, sometimes after a few minutes.
I don't know what to do.
I load my ads on another thread, and otherwise there isn't anything to load.
EDIT:
NO SKShapeNodes.
Xcode 7 + iPhone 6 Plus (iOS 9.3.2)
@Alessandro Ornano
My update method only handles the background images. I already tried without backgrounds: same problem. I read about performance issues in iOS 9.
Do you know anything about that?
-(void)update:(NSTimeInterval)currentTime{
    // Endless scrolling: once a background image has moved past 1.5x the screen
    // width, reposition it behind its sibling (the +1 overlap avoids a visible seam).
    //BG
    if (self.background1.position.x >= self.size.width*1.5) {
        self.background1.position = CGPointMake(self.background2.position.x - self.background1.size.width+1, self.size.height/2);
    }
    if (self.background2.position.x >= self.size.width*1.5) {
        self.background2.position = CGPointMake(self.background1.position.x - self.background2.size.width+1, self.size.height/2);
    }
    //BG CLOUDS
    if (self.backgroundClouds1.position.x >= self.size.width*1.5) {
        self.backgroundClouds1.position = CGPointMake(self.backgroundClouds2.position.x - self.backgroundClouds1.size.width+1, self.size.height/2);
    }
    if (self.backgroundClouds2.position.x >= self.size.width*1.5) {
        self.backgroundClouds2.position = CGPointMake(self.backgroundClouds1.position.x - self.backgroundClouds2.size.width+1, self.size.height/2);
    }
}
I have these two functions that measure the elapsed time while the phone is locked or the app is in the background:
func saveTimeInBackground(){
    startMeasureTime = Int(Date.timeIntervalSinceReferenceDate)
}

func timeOnAppActivated(){
    stopMeasureTime = Int(Date.timeIntervalSinceReferenceDate)
    elapsedTime = stopMeasureTime - startMeasureTime
    seconds = seconds - elapsedTime + 2
    if seconds > 0 {
        timerLbl.text = "time: \(seconds)"
    } else {
        seconds = 0
        timerLbl.text = "time: \(seconds)"
    }
}
and then in viewDidLoad() I have observers that trigger the functions when the app becomes active/inactive:
NotificationCenter.default.addObserver(self, selector: #selector(saveTimeInBackground), name: Notification.Name.UIApplicationWillResignActive, object: nil)
NotificationCenter.default.addObserver(self, selector: #selector(timeOnAppActivated), name: Notification.Name.UIApplicationDidBecomeActive, object: nil)
The problem is that when the app becomes active there is a difference of roughly 2 seconds, so I've added 2 seconds and it seems to work fine, but only if the elapsed time is > 15 seconds.
If I lock the phone and immediately unlock it, 5 or more seconds go missing. For example, if there are 50 seconds left and I lock and immediately unlock the phone, only about 42 seconds are left.
Can anyone please explain what I am doing wrong?
Edit: The logic of the app is this:
It starts a match between 2 players with 60 seconds per game. The problem is that when one of the players locks the phone, the app stops measuring the time. This way, if player 1 has 10 seconds left to make a move, player 2 still has 50 seconds left. I'm looking for a reliable way to calculate the time even if a player locks the phone or puts the app in the background.
Edit 2: I think I figured out what the problem is: "seconds" is an Int while the Date values are not, so the fractional part is dropped in the conversion. I haven't tested it yet, but when I have the solution I'll post the answer. Thanks all for your time!
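(To illustrate the suspected conversion issue with made-up values: Int(_:) on a Double truncates toward zero, so the computed elapsed time can be off by up to a second in either direction.)

    // Made-up reference-date readings; not from the question.
    let t0 = 1000.9, t1 = 1005.1
    let elapsed = Int(t1) - Int(t0)   // 1005 - 1000 = 5
    let trueElapsed = t1 - t0         // 4.2, so the Int math overstates it here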
You're relying on exact timing of notifications that aren't guaranteed to have any exact timing. There's no guarantee about when, exactly, either of those notifications will arrive, and there's nothing you can do about that. Even your two-second fix is, as you say, approximate. It'll probably be different on different models of iPhone or even at different times on the same iPhone, depending how busy iOS is when you check.
What's more, when you go into the background, you can't be certain that you'll stay there. Once in the background, iOS might decide to terminate your app at any time.
I'm not sure what the goal is here but I think you'll need to reconsider what you want to do and see if there's some other approach. Your current two-second hack will, at best, spawn a bunch of other hacks (like the 15 second threshold you mention) without ever being especially accurate. And then it'll probably all break in the next iOS update when some iOS change causes the timing to change.
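(To make "some other approach" concrete, here is one possibility, sketched as my own assumption rather than taken from this answer: store an absolute deadline and derive the remaining time from the wall clock whenever you need it.)

    import Foundation

    // Sketch: an absolute deadline survives backgrounding and late notifications,
    // because the remaining time is recomputed from the wall clock on demand.
    let deadline = Date().addingTimeInterval(60)          // set once at match start
    let remaining = max(0, deadline.timeIntervalSinceNow) // query at any time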
I would use a Date object to track the game time.
func gameStart() {
    gameStartDate = Date()
}

func timeOnAppActivated() {
    let secondsLeft = 60 - abs(gameStartDate?.timeIntervalSinceNow ?? 0)
    if secondsLeft > 0 {
        timerLbl.text = "time: \(secondsLeft)"
    } else {
        timerLbl.text = "time: 0"
    }
}
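Because the remaining time is recomputed from the wall clock each time, it stays correct no matter how long the app spends in the background or how late the activation notification arrives.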
OK, like I mentioned in Edit 2 of the question:
The first issue was that "seconds" is an Int, so it almost always gains or loses a little when converted from a Double.
But the main problem was that I had to invalidate the timer when the app enters the background, and I didn't.
So now, by invalidating the timer when the app gets the notification that it will enter the background, and restarting it when it enters the foreground, everything works fine.
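(A minimal sketch of that pattern, assuming the observers and properties from the question plus a hypothetical countdownTimer; this is not the poster's exact code:)

    var countdownTimer: Timer?

    func saveTimeInBackground() {
        startMeasureTime = Int(Date.timeIntervalSinceReferenceDate)
        countdownTimer?.invalidate()   // stop ticking while in the background
    }

    func timeOnAppActivated() {
        stopMeasureTime = Int(Date.timeIntervalSinceReferenceDate)
        seconds = max(0, seconds - (stopMeasureTime - startMeasureTime))
        timerLbl.text = "time: \(seconds)"
        countdownTimer = Timer.scheduledTimer(withTimeInterval: 1, repeats: true) { _ in
            // the existing one-second countdown tick goes here
        }
    }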
To test this properly, call those methods on a button click. It may be because of a delay in releasing some resources in the background.
This line always executes with a 7-second delay: when I select the restart button, the scene loads only after 7 seconds have passed.
if (GUI.Button(new Rect(Screen.width/4+10, Screen.height/4+2*Screen.height/10+10, Screen.width/2-20, Screen.height/10), "RESTART"))
{
    Application.LoadLevel(Application.loadedLevel); // runs 7 seconds later
}
How do I solve this problem?
There are two reasons for this: either the code that loads objects into your scene is taking a long time (are you relying too heavily on Resources.Load / PlayerPrefs or something similar?), or you just have a really big scene.
One thing you can do: if you have objects that are needed in two or more scenes, don't create them twice; just don't destroy the objects on load (DontDestroyOnLoad).
I have a GPUImageColorDodgeBlend filter with two inputs connected:
A GPUImageVideoCamera which is getting frames from the iPhone video camera.
A GPUImageMovie which is an (MP4) video file that I want to have laid over the live camera feed.
The GPUImageColorDodgeBlend is then connected to two outputs:
A GPUImageView to provide a live preview of the blend in action.
A GPUImageMovieWriter to write the movie to storage once a record button is pressed.
Now, before the video starts recording, everything works OK 100% of the time. The GPUImageMovie is blended over the live camera feed fine, and no issues or warnings are reported.
However, when the GPUImageMovieWriter starts recording, things start to go wrong randomly. About 80-90% of the time, the GPUImageMovieWriter works perfectly: there are no errors or warnings, and the output video is written correctly.
However, about 10-20% of the time (and from what I can see, this is fairly random), things seem to go wrong during the recording process (although the on-screen preview continues to work fine).
Specifically, I start getting hundreds and hundreds of Problem appending pixel buffer at time: errors.
This error originates from the - (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex method in GPUImageMovieWriter.
This issue is triggered by problems with the frameTime values that are reported to this method.
From what I can see, the problem is caused by the writer sometimes receiving frames timestamped by the video camera (which tend to have extremely high time values, like 64616612394291 with a timescale of 1000000000), but then sometimes receiving frames timestamped by the GPUImageMovie, which are much lower (like 200200 with a timescale of 30000).
It seems that GPUImageMovieWriter is happy as long as the frame times are increasing, but once a frame time decreases, it stops writing and just emits Problem appending pixel buffer at time: errors.
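(To make the timescale mismatch concrete, a small Swift sketch using the two values quoted above; CMTimeCompare normalizes the differing timescales before comparing:)

    import CoreMedia

    // The two clocks from the question, expressed as CMTimes.
    let cameraTime = CMTime(value: 64616612394291, timescale: 1_000_000_000) // ~64,617 s
    let movieTime  = CMTime(value: 200200, timescale: 30_000)                // ~6.7 s

    // A movie-stamped frame arriving after a camera-stamped one looks like
    // time running backwards to anything expecting monotonic timestamps.
    assert(CMTimeCompare(movieTime, cameraTime) < 0)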
I seem to be doing something fairly common, and this hasn't been reported anywhere as a bug, so my questions are (answers to any or all of these are appreciated -- they don't all need to necessarily be answered sequentially as separate questions):
Where do the frameTime values come from -- why does it seem so arbitrary whether the frameTime is numbered according to the GPUImageVideoCamera source or the GPUImageMovie source? Why does it alternate between the two -- shouldn't the frame numbering scheme be uniform across all frames?
Am I correct in thinking that this issue is caused by non-increasing frameTimes?
...if so, why does GPUImageView accept and display the frameTimes just fine on screen 100% of the time, yet GPUImageMovieWriter requires them to be ordered?
...and if so, how can I ensure that the frameTimes that come in are valid? I tried adding if (frameTime.value < previousFrameTime.value) return; to skip any lesser-numbered frames which works -- most of the time. Unfortunately, when I set playsAtActualSpeed on the GPUImageMovie this tends to become far less effective as all the frames end up getting skipped after a certain point.
...or perhaps this is a bug, in which case I'll need to report it on GitHub -- but I'd be interested to know if there's something I've overlooked here in how the frameTimes work.
I've found a potential solution to this issue, which I've implemented as a hack for now, but could conceivably be extended to a proper solution.
I've traced the source of the timing back to GPUImageTwoInputFilter which essentially multiplexes the two input sources into a single output of frames.
In the method - (void)newFrameReadyAtTime:(CMTime)frameTime atIndex:(NSInteger)textureIndex, the filter waits until it has collected a frame from the first source (textureInput == 0) and the second, and then forwards on these frames to its targets.
The problem (the way I see it) is that the method simply uses the frameTime of whichever frame comes in second, which may not always be from the same source (for whatever reason). (I'm excluding the case of still images, for which CMTIME_IS_INDEFINITE(frameTime) == YES, because I don't work with still images.)
The relevant code which checks for both frames and sends them on for processing is as follows:
if ((hasReceivedFirstFrame && hasReceivedSecondFrame) || updatedMovieFrameOppositeStillImage)
{
    [super newFrameReadyAtTime:frameTime atIndex:0]; // this line has the problem
    hasReceivedFirstFrame = NO;
    hasReceivedSecondFrame = NO;
}
What I've done is adjust the above code to [super newFrameReadyAtTime:firstFrameTime atIndex:0] so that it always uses the frameTime from the first input and totally ignores the frameTime from the second input. So far, it's all working fine like this. (I'd still be interested to hear why it's written this way, given that GPUImageMovieWriter seems to insist on increasing frameTimes, which the method as-is doesn't guarantee.)
Caveat: This will almost certainly break entirely if you work only with still images, in which case you will have CMTIME_IS_INDEFINITE(frameTime) == YES for your first input's frameTime.
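(For completeness: the frame-skipping guard mentioned in the question compares raw value fields, which is unreliable across differing timescales. A hedged Swift sketch of the same idea using CMTimeCompare, with illustrative names:)

    import CoreMedia

    var previousFrameTime = CMTime.zero

    // Returns true only for frames whose timestamps move forward.
    func shouldWrite(_ frameTime: CMTime) -> Bool {
        guard CMTimeCompare(frameTime, previousFrameTime) > 0 else {
            return false // drop frames whose timestamps go backwards
        }
        previousFrameTime = frameTime
        return true
    }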
As the title states, sounds will just play at ridiculously quiet volumes. I thought it wasn't playing at all until I heard sounds barely audible at max volume.
It's been fine for weeks. Not a problem. Suddenly it started doing this out of nowhere. I've tried cleaning/rebuilding, restarting Xcode, restarting the device, etc. Nothing fixes it, and then it'll suddenly come back in a random build. This is really starting to frustrate me, and I'm disappointed I went with OpenAL, as this isn't the first problem I've had with it in just a few weeks of using it. I can't release anything with sound that only MIGHT work sometimes.
I'm using AVAudioPlayer simultaneously to play background music, and have not had a single problem with that.
Anyone have any idea what could be causing this?
In response to user1260708:
My init method looks like this:
bool D2DSoundController::init(Map<D2DSound> *sounds) {
    if (!_initialzed) {
        OSStatus result = AudioSessionInitialize(NULL, NULL, &_interruptionListener, (void*)_context);
        ulong cat = kAudioSessionCategory_MediaPlayback;
        result |= AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(cat), &cat);
        _sounds = sounds;
        _device = alcOpenDevice(0);
        if (_device != 0) {
            _context = alcCreateContext(_device, 0);
            if (_context != 0) {
                alcMakeContextCurrent(_context);
                alGenSources(_numSources, _sources);
                //alDistanceModel(AL_LINEAR_DISTANCE_CLAMPED);
                //alListener3f(AL_POSITION, 300.0f, 200.0f, 0.0f); --- I think this was the problem?
                _initialzed = true;
                return (alGetError() == 0 && result == 0);
            }
            return false;
        }
        return false;
    }
    return false;
}
However, I think I found the issue. After commenting out the alListener3f call (I only just noticed it, as the line above it had been commented out for a while), it seems to be working again. I don't see why this suddenly became an issue, though, as it's been there the whole time. Distance attenuation wasn't working right for me either (which is why that call was there in the first place, for testing), so I had decided not to use it, but I guess I forgot to remove that line.
See this question: Distance Attenuation with OpenAL on iPhone/iPod
Any input on the attenuation issue (which seems to be the root of this as well) would be great, though!
As a side note, since I dropped attenuation I've been using stereo sounds as well, which shouldn't affect that, so I'm still confused.
I'm trying to play around with the Cocos2D effects and created two methods to start and stop the Liquid action. However, my application drops from 60 fps to 30 fps when the effect is applied, and the fps doesn't recover when the scheduled stop method is called.
I originally thought that the effect was still being rendered even after the action completed, but after reading through EffectsTest.h/.m in the Cocos2D 0.8 zip I can't find any reference to how this is handled. Can anyone shed some light on this issue?
// effects
-(void)enableLiquidEffect
{
    id liquid = [Liquid actionWithWaves:6 amplitude:20 grid:ccg(15, 10) duration:3];
    [self schedule:@selector(disableLiquidEffect) interval:3.0];
    [self runAction:liquid];
}

-(void)disableLiquidEffect
{
    [self unschedule:@selector(disableLiquidEffect)];
    [self stopAllActions];
}
Cheers,
AntonMills
Just a little tip here. I know this was asked years ago, but someone might still come across it.
The code is a little bit of an overkill; here's how to do it:
// effects
-(void)enableLiquidEffect
{
    id liquid = [Liquid actionWithWaves:6 amplitude:20 grid:ccg(15, 10) duration:3];
    // No need to unschedule after 3 seconds, since the duration is already set to 3 seconds.
    [self runAction:liquid];
}

-(void)disableLiquidEffect
{
    [self stopAllActions];
}
Besides that, the code is perfect.
Just a guess, but since the node will still have its grid transform set by Liquid, it may still be applying a more complex transform than needed after the action is done. Save off your transform before starting, and when the effect stops, set it back. You could also try just setting it to nil.