SpriteKit scene with low fps on start - swift

My problem is that my GameScene starts with roughly double the node and draw counts and about 40 fps for several seconds. The problem appears only on my iPad (mini Retina); on my iPhone 5 the game runs smoothly from the start, although the node count is still a lot higher than it should be.

This 40 fps problem is an issue within the iOS frameworks, caused by some kind of throttling applied by iOS.
Perhaps the throttling was designed to give a more consistent experience to an app struggling to maintain 60 fps, but nobody knows; Apple has never commented on it.
It became prevalent with the rollout of iOS 9 across SceneKit, SpriteKit and Metal, but has been seen in OpenGL ES locked projects too. It was also noticed in earlier versions of iOS, particularly in apps/games using CADisplayLink.
Don't worry about those asking for code; you're right in assuming that something is wrong that isn't pertinent to your code.
Here's a deeper examination of a similar issue (probably from the same root cause) within SceneKit: Inconsistent SceneKit framerate

I also had a problem with low FPS at the start (when using SpriteKit and UIKit together).
I used a xib with an SKView, and for me the solution was to add a dependency to the scene in the Attributes Inspector,
as shown in the picture below:

Related

scene transition causes memory leak

I'm making a simple game using the Xcode 10.1 IDE and Swift 4.2.
I've designed the main menu system so that each page's UI is presented through a separate scene.
Each time a new scene is loaded I get approximately a 0.1 MB increase in memory usage. Not much, but I don't want to start scaling the game with this issue.
Memory leak when presenting SpriteKit scenes
Memory problems when switching between scenes SpriteKit
Tab-based SpriteKit Apps and Scene Caching
I've had a good look through reference material and online. I've checked for retain cycles through the following: I inserted deinit statements in all the scene and object classes used, and they are all called correctly; I profiled the app looking for zombie objects and leaks, with nothing obvious shown in the profiler results.
Does anyone have any idea what causes memory leaks or caching on scene transitions, and ways to prevent this?
I'm stumped; scaling the game now seems like the wrong thing to do, as the issue will probably compound as complexity increases.
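(For reference, the deinit check described above is typically just a log statement in each scene class, roughly along these lines; MenuScene is a placeholder name.)

import SpriteKit

class MenuScene: SKScene {
    // If this never prints after transitioning away from the scene,
    // something is still retaining it.
    deinit {
        print("MenuScene deinit")
    }
}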
After further testing I found the issue is with attaching sound actions through the scene editor.
Attaching the actions through code gives stable memory use, with no 0.1 MB increase on scene transitions.
It's only a partial answer, but if anyone else comes across this and has further info, please post.
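A minimal sketch of what attaching the actions through code can look like (the scene class and sound file name here are placeholders, not the poster's actual project):

import SpriteKit

class GameScene: SKScene {
    // Preload the sound action once and reuse it, instead of attaching a
    // play-sound action to a node in the scene editor. "tap.wav" is a
    // placeholder file name.
    private static let tapSound = SKAction.playSoundFileNamed("tap.wav",
                                                              waitForCompletion: false)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        run(GameScene.tapSound)
    }
}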

120 FPS in Xcode Debug Navigator vs ~60 FPS in SKScene - Swift

I'm building a SpriteKit game and have a minor lag that I can't get to fully go away. CPU usage tends to vary; however, I see this lag regardless of whether CPU usage is above 40% or below 20%.
I've combed through the code to try to make it as efficient as possible (e.g., reusing nodes, hardcoding values, optimizing conditionals, etc.) and I'm about out of ideas.
I've noticed this difference between the FPS per the debug tool (120 FPS) and what I see within the SKView (~60 FPS, but never lower than about 58 FPS). My thought was that maybe the view controller running at a higher FPS is somehow stealing resources from the SKScene. I've messed around with preferredFrameRate, trying to set it to 120 in the SKScene (I don't think I can set it on the view controller), but that didn't change the rate of the SKScene.
I'm using an iPhone 6s, so not exactly the newest iPhone, but I also downloaded another ball bounce game, One More Brick, and there's basically no lag whatsoever, so I don't think it's my phone.
Anyone familiar with this issue?
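For anyone trying the same experiment: the relevant property is preferredFramesPerSecond, and it is exposed on the SKView (iOS 10 and later) rather than on the scene or the view controller. A minimal sketch, assuming the view controller's root view is an SKView:

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // The frame-rate cap lives on the SKView, not on the scene or the
        // view controller. On a 60 Hz panel such as the iPhone 6s, values
        // above 60 have no visible effect.
        if let skView = view as? SKView {
            skView.preferredFramesPerSecond = 60
        }
    }
}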

Screen sizes in Xcode 7 when using SpriteKit

I am relatively new to SpriteKit and have been attempting to create my first basic game. All physics and other basics seem OK, but for some reason whenever I build and run, the screen dimensions are off (it looks like the default is 1024×768).
Pretty sure I'm missing something fundamental here, but it isn't immediately obvious how to adapt the scene to any size of iPhone screen (which is my ultimate goal).
My question is whether this is actually just a settings issue, or whether it's necessary to implement code.
Thanks in advance and have a great day!:)
To answer the first part, you can easily change the size of your scene.
If you take the default GameScene, click out of the scene and look at the Attributes Inspector, you will see the default size of 1024×768. Personally, if landscape, I tend to work with an iPhone 5 design resolution of 568×320.
Regarding multiple devices, SpriteKit works pretty well out of the box. You should look at the documentation on scaleMode, and take a look in GameViewController.swift. .AspectFit worked really well for me, nearly perfect across all devices apart from a little letterboxing on iPad; for the amount of effort put in, it's more than good enough.
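A rough sketch of that setup in code, using the default GameScene from the template (modern Swift spelling shown; in Xcode 7-era Swift the case was written .AspectFit):

import SpriteKit
import UIKit

class GameViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        guard let skView = view as? SKView else { return }

        // 568x320 is the landscape iPhone 5 design resolution mentioned above.
        let scene = GameScene(size: CGSize(width: 568, height: 320))

        // .aspectFit preserves the aspect ratio and letterboxes where needed.
        scene.scaleMode = .aspectFit

        skView.presentScene(scene)
    }
}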
On a side note, I've found the following iPhone Resolution Guide resource useful in the past.

iPhone OpenGL game with ads == fps problem?

I have a game that runs fine as is (around 30 fps), but the fps went down the drain when I tried to implement ads. I tried Greystripe and iAd, with the same result (iAd was maybe a bit worse). The average fps is almost the same, but there are huge spikes all the time (1-2 spikes per second) and the game is unplayable.
I guess it is because the ad is in another view. I read somewhere that OpenGL apps on the iPhone don't like having other views alongside them, but there are plenty of games with ads on the App Store. How do they do it?
My implementation should be OK. I did everything as the documentation and samples told me. I have my OpenGL view and ad view as subviews of the app window, the ad view being in front of the OpenGL view and thus covering part of it. Could this be the problem? Is it better to make the OpenGL view smaller to leave space for the ad so they don't overlap? Do you have any other ideas about what could be wrong?
Lope, I've created a gist at this link with a singleton "AdManager" class I wrote to handle iAds using cocos2d. Cocos2d sits on top of OpenGL, of course, and I've found that this code doesn't affect FPS even for relatively complicated games.
You'll have to modify this a bit to work with your application, changing out the cocos2d calls, etc, but this will give you asynchronous loading of iAds, which should help the FPS issue.
To use this class, include its header and call
[[AdManager sharedManager] attachAdToView:self.view];
wherever you need iAds. The ads will remain hidden until an ad loads, at which time they'll pop up at the top of screen. (The class works for iOS 4.0, 4.1 and 4.2).
Also, I should add that I have cocos2d running inside of an overall UIViewController that I call "Cocos2DController". When I attach the ads to a cocos2d view, I'm using
[[AdManager sharedManager] attachAdToView:[[CCDirector sharedDirector] openGLView]];
Best of luck!
Apple's choices can be hit and miss, but go for the sure thing and implement the ads in other parts of the app where they are appealing and not intrusive. It will be better for the framerate, and for you.
Try downloading the ads in a separate, low-priority thread. You can thus ensure that loading the ads does not take too much CPU time. With a bit of synchronisation you can make sure you don't try to display the new ad until it is completely ready to display. Sure, it will take some CPU time away from what you are trying to do, but set your priorities right and it should only take time when you are busy doing nothing.
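In modern terms that pattern might look roughly like this (the URL, class name and image view are placeholders; the original answers predate Swift and GCD quality-of-service classes):

import UIKit

final class AdLoader {
    // Hypothetical ad asset URL; a real ad SDK would provide its own fetch API.
    private let adURL = URL(string: "https://example.com/banner.png")!

    // Fetch the ad image on a low-priority queue, and only touch UIKit once
    // it is completely ready, back on the main thread.
    func refreshAd(into imageView: UIImageView) {
        DispatchQueue.global(qos: .utility).async {
            guard let data = try? Data(contentsOf: self.adURL),
                  let image = UIImage(data: data) else { return }
            DispatchQueue.main.async {
                imageView.image = image
            }
        }
    }
}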
Please excuse the thread necro'ing here, but I've used Stack Overflow a lot to help me through the problems I've had during coding, and thought my experience might be useful to someone in the future.
My simple cocos2d game ran with decent FPS (the FPS display rarely changed at all) until I implemented AdWhirl (integrating AdMob + iAd only). It would then run OK for the first few iterations, but upon the 9th or 10th scene refresh (single-screen game, time in each scene < 5 seconds on average) the FPS would dive to ~20 FPS, and drop again each time the scene refreshed.
Turns out, in my n00biness (this may be particular to me :) ), I was calling the scene from within itself. That is, once the actions had finished, the last action was to call the main scene again (a lazy way of rebuilding the scene for the user to have another go). This init'd the views and view controllers I had inserted to handle the AdWhirl ads all over again, and not only did I have a memory leak, I had 10+ view controllers all trying to request and service ads from AdWhirl. Once I got a clue and took that self-referring loop out, all was good.

iPhone + OpenGL + Touches: FPS drop

Recently I ran into a very strange issue: touching the screen of the iPhone and moving a finger around can eat up to 50% of my FPS. Yeah, I checked my code for possible bottlenecks; that's not the issue. The last resort I tried before writing this post was commenting out all the touch-processing code and looking at the FPS then. The results: no touches, 58-60 fps; touching and moving a finger, 35-40 fps instantly.
The rendering is done on a separate thread, so that no main run-loop events should collide with it. However, it's crucial for me (and the game I'm developing) to resolve this issue, because such an FPS drop is really noticeable.
Thank you for your help in advance.
UPDATE: it seems that setting the rendering thread's priority to a higher value helps a bit...
The iPhone, iPod Touch, and iPad are all single-processor, single-core devices. Simply putting your rendering code on a separate thread from touch event handling—though a good idea—won't prevent the touch processing from eating up CPU cycles. The only way to make your framerates go up will be to either make the touch handling code faster or make the rendering faster. Which you pursue depends on the specifics of your application.
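The thread-priority tweak mentioned in the update above might be sketched like this in modern Swift (drawFrame() is only a placeholder; the original question predates Swift and today's multi-core devices):

import Foundation

func drawFrame() {
    // ... render one frame here; sleeping stands in for the actual GL work ...
    Thread.sleep(forTimeInterval: 1.0 / 60.0)
}

let renderThread = Thread {
    while !Thread.current.isCancelled {
        drawFrame()
    }
}
renderThread.name = "RenderThread"
renderThread.qualityOfService = .userInteractive  // ask the scheduler to favour this thread
renderThread.start()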