Cocos2D - Why is my FPS so low? Single scene with tilemap - iphone

I wrote a simple game with Cocos2D where I have an animated sprite walking around a plain green field.
The walking sprite is 64x64px, the joystick comes from the SneakyJoystick class, and the green field is a tilemap built in Tiled from 5-6 different 32x32-pixel tiles. The map itself is 50 x 50 tiles.
I'm getting 20-30 FPS and I'm not sure why. When I decreased the tilemap size to 10x10 tiles, the frame rate shot up to 50-60 FPS. I don't know what to do.
Edit: I've added the initialization block for the gameplay layer. I'm testing on the simulator.
-(id) init
{
    if( (self = [super init]) )
    {
        // Load the tile map and grab its layers
        self.tileMap = [CCTMXTiledMap tiledMapWithTMXFile:@"TileMap.tmx"];
        self.background = [_tileMap layerNamed:@"Background"];
        self.meta = [_tileMap layerNamed:@"Meta"];
        self.topLayer = [_tileMap layerNamed:@"TopLayer"];
        _topLayer.visible = YES;
        _background.visible = YES;
        _meta.visible = YES;
        [self addChild:_tileMap z:-1];

        // Sprite sheet for the walk cycle
        [[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"male_walkcycle.plist"];
        sceneSpriteBatchNode = [CCSpriteBatchNode batchNodeWithFile:@"male_walkcycle.png"];
        [self addChild:sceneSpriteBatchNode z:0];

        // Read the spawn point from the map's object layer
        CCTMXObjectGroup *objects = [_tileMap objectGroupNamed:@"Objects"];
        NSAssert(objects != nil, @"'Objects' object group not found");
        NSMutableDictionary *spawnPoint = [objects objectNamed:@"SpawnPoint"];
        NSAssert(spawnPoint != nil, @"SpawnPoint object not found");
        int x = [[spawnPoint valueForKey:@"x"] intValue];
        int y = [[spawnPoint valueForKey:@"y"] intValue];

        Caster *casterPlayer = [[Caster alloc] initWithSpriteFrame:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:@"male_walkcycle_19.png"]];
        casterPlayer.position = ccp(x, y);
        casterPlayer.joystick = nil;
        [sceneSpriteBatchNode addChild:casterPlayer z:1000 tag:kCasterTagValue];
        [casterPlayer release];

        // 50x50 debug box built from a 1x1 white texture
        CCSprite *sprite = [CCSprite node];
        CGSize size = CGSizeMake(50, 50);
        GLubyte *buffer = malloc(sizeof(GLubyte) * 4);
        for (int i = 0; i < 4; i++) { buffer[i] = 255; }
        CCTexture2D *tex = [[CCTexture2D alloc] initWithData:buffer pixelFormat:kCCTexture2DPixelFormat_RGB5A1 pixelsWide:1 pixelsHigh:1 contentSize:size];
        [sprite setTexture:tex];
        [sprite setTextureRect:CGRectMake(0, 0, size.width, size.height)];
        free(buffer);
        sprite.position = ccp(x, y);
        sprite.visible = NO;
        [self addChild:sprite z:5 tag:debugBoxTagValue];

        [self setViewpointCenter:casterPlayer.position];
        [self scheduleUpdate];
    }
    return self;
}

I'm testing on the simulator.
I can't believe how often I need to repeat this:
Ignore the Simulator!
The Simulator is not an actual iOS device. It runs on your Mac. It has more memory and CPU power available. At the same time its rendering performance is severely limited because it's not hardware accelerated rendering, it's a software renderer.
Even the fastest 2012 iMac can't run scenes at 60 fps on the iPad or iPhone Retina Simulator. On the iPad Retina Simulator you'll be lucky to get 20 fps.
Moreover, developers are the only people who ever run their apps on the Simulator. Neither your users nor your testers will be using the Simulator to run your app. They'll be using an actual device.
Any time spent optimizing performance on the Simulator is nothing but a complete waste of time. This generally applies to any issue that only occurs on the Simulator.
Test on a device! Every issue only observed on the iOS Simulator is moot, void, invalid, and utterly irrelevant, especially performance.
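If you do want a number to compare, enable cocos2d's built-in FPS display and then run on the device. A minimal sketch, assuming cocos2d-iphone; note the setter is named differently in 1.x and 2.x:

// In the app delegate, right after the CCDirector is configured.
// cocos2d-iphone 1.x:
[[CCDirector sharedDirector] setDisplayFPS:YES];
// cocos2d-iphone 2.x renamed the same switch:
// [[CCDirector sharedDirector] setDisplayStats:YES];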

Related

How to scale Cocos2d app to iPhone 5 screen?

I have a Cocos3d v1.01 app and I am currently trying to make it fit the iPhone 5 full screen.
Simply stretching the background is enough for most screens, as it is just a pattern that fills the screen and no UI parts need to change; however, there is one screen I am struggling to change.
This is the part of the code that I believe sets the background for that screen:
-(void)animations
{
    AppDelegate *app = (AppDelegate *)[[UIApplication sharedApplication] delegate];
    CGSize size = [[CCDirector sharedDirector] winSize];

    NSMutableArray *bodyanimframeBuddyBack = [[NSMutableArray alloc] init];
    for (int j = 1; j <= 4; ++j)
    {
        [bodyanimframeBuddyBack addObject:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:[NSString stringWithFormat:@"shark_sea_waves5%d.png", j]]];
    }
    CCAnimation *BuddyAnimBack = [CCAnimation animationWithFrames:bodyanimframeBuddyBack delay:0.3f];
    self.backgroundAction = [CCRepeatForever actionWithAction:[CCAnimate actionWithAnimation:BuddyAnimBack restoreOriginalFrame:NO]];

    CCSprite *bg = [CCSprite spriteWithSpriteFrameName:@"shark_sea_waves51.png"];
    bg.position = ccp(size.width/2, size.height/2);
    [self addChild:bg];
    [bg runAction:backgroundAction];
}
Is there a way of adjusting that? I am guessing it is the bg.position part that would need changing, to tell it whether it is an iPhone 5 or not?
Thanks in advance,
Chris
bg.scaleX = size.width / bg.contentSize.width;
This will stretch the bg width to match the screen width.
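If the whole screen needs covering rather than just the width, the same idea works on both axes. A minimal sketch, assuming bg and size are the sprite and winSize from the question's code:

// Stretch the background to fill the screen on both axes.
// This ignores aspect ratio, which is usually acceptable for a repeating wave pattern.
CGSize size = [[CCDirector sharedDirector] winSize];
bg.scaleX = size.width / bg.contentSize.width;
bg.scaleY = size.height / bg.contentSize.height;
bg.position = ccp(size.width / 2, size.height / 2);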
Why can't you use a separate background for the iPhone 5, just like you did for the iPad?
#define IS_IPHONE5 (([[UIScreen mainScreen] bounds].size.height - 568) ? NO : YES)
#define TEX_MM_BG ((IS_IPHONE5) ? (@"shark_sea_waves51-whd.png") : (@"shark_sea_waves51.png"))

-(void)setupBackground
{
    // mS is assumed to be the director's winSize cached elsewhere in this class
    CCSprite *bg = [CCSprite spriteWithFile:TEX_MM_BG];
    bg.position = ccp(mS.width * 0.5f, mS.height * 0.5f);
    [self addChild:bg z:-3 tag:kTagBackground];
}
// Also add these images to the project:
shark_sea_waves51.png //480x320
shark_sea_waves51-hd.png //960x640
shark_sea_waves51-whd.png //1136x640
shark_sea_waves51-ipad.png //1024x768
shark_sea_waves51-ipadhd.png //2048x1536

Low FPS with Background Image

I have a big problem with my app.
I am a few steps from finishing my app, and today I wanted to add a background image.
My normal FPS is 180 with a black background, but when I added my iPhone 4 background (960x640) the FPS dropped to 20-30, so it's impossible to play. The next thing that happens is that when my player and enemy collide they stop their actions, but only when my background is there. I am using SpaceManager, so maybe it's because of its shapes. Does anyone know another way to add a background in SpaceManager so it doesn't slow my FPS? Here is how I try to add the background:
-(id) init
{
    if ((self = [super init]))
    {
        CCSprite *background = [CCSprite spriteWithFile:@"Background.png"];
        background.anchorPoint = ccp(0, 0);
        [self addChild:background z:-1];
    }
    return self;
}
The most likely cause of the drop in FPS is that the picture is designed for the Retina display, and since it is a large image it slows down the app. The device you are using also has an effect on how fast the game runs, so if it's slow only on a 3GS or something like that, just drop support for it.
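If you want to keep the Retina-sized image, one general cocos2d technique that sometimes helps with a large opaque backdrop (a hedged suggestion, not something verified against this exact project) is to load it as a 16-bit texture so it uses half the memory and bandwidth:

// Use a 16-bit format for the big opaque background, then restore the default.
// RGB565 has no alpha channel, which is fine for a full-screen backdrop.
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGB565];
CCSprite *background = [CCSprite spriteWithFile:@"Background.png"];
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_Default];
background.anchorPoint = ccp(0, 0);
[self addChild:background z:-1];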
My objects are static mass, so I don't have [sm addcollision] there, but I do have this code:
-(void) KolizeHraceSProtiHracem
{
    if (player.position.x - enemy.position.x < 40)
    {
        [enemy stopAllActions];
        [player stopAllActions];
        [player runAction:[CCMoveTo actionWithDuration:0.2 position:ccp(player.position.x + 40, 10)]];
        [enemy runAction:[CCMoveTo actionWithDuration:0.1 position:ccp(enemy.position.x - 10, 10)]];
    }
}
I don't think this code is bad, because if I don't have a background the code works, but if I do, it stops all actions.

iOS 6 Face Detection Not Working

I have used the following code to detect faces on iOS 5:
CIImage *cIImage = [CIImage imageWithCGImage:image.CGImage];
CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
NSArray *features = nil;
features = [detector featuresInImage:cIImage];
if ([features count] == 0)
{
    NSDictionary* imageOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:6] forKey:CIDetectorImageOrientation];
    features = [detector featuresInImage:cIImage options:imageOptions];
}
With this code, I am able to detect faces on iOS 5. But recently we upgraded our systems to Xcode 4.4 and iOS 6, and now the face detection is not working properly.
What changes do I need to make to detect faces on iOS 6?
Any kind of help is highly appreciated.
I have noticed that face detection in iOS6 is not as good as in iOS5.
Try with a selection of images. You will likely find that it works OK in iOS6 with a lot of the images, but not all of them.
I have been testing the same set of images in:
1. The Simulator running iOS 6.
2. iPhone 5 (iOS6)
3. iPhone 3GS (iOS5).
The 3GS detects more faces than the other two options.
Here's the code; it works on both, but just not as well on iOS 6:
- (void)analyseFaces:(UIImage *)facePicture {
    // Create CI image of the face picture
    CIImage* image = [CIImage imageWithCGImage:facePicture.CGImage];
    // Create face detector with high accuracy
    CIDetector* detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
    // Create array of detected faces
    NSArray* features = [detector featuresInImage:image];
    // Read through the faces and add each face image to the facesFound mutable array
    for (CIFaceFeature* faceFeature in features)
    {
        CGSize parentSize = facePicture.size;
        CGRect origRect = faceFeature.bounds;
        CGRect flipRect = origRect;
        flipRect.origin.y = parentSize.height - (origRect.origin.y + origRect.size.height);
        CGImageRef imageRef = CGImageCreateWithImageInRect([facePicture CGImage], flipRect);
        UIImage *faceImage = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);
        if (faceImage)
            [facesFound addObject:faceImage];
    }
}
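One workaround worth trying when iOS 6 misses faces that iOS 5 found (a guess based on how sensitive CIDetector is to orientation, not something I have verified on every image) is to retry detection with the other CIDetectorImageOrientation values:

// Retry detection across the eight EXIF orientations until something is found.
NSArray *features = [detector featuresInImage:image];
for (int orientation = 1; orientation <= 8 && [features count] == 0; orientation++)
{
    NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:orientation]
                                                        forKey:CIDetectorImageOrientation];
    features = [detector featuresInImage:image options:options];
}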
I hope this is helpful for you.
Add the CoreImage.framework.
-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView* image = [[UIImageView alloc] initWithImage:
                          [UIImage imageNamed:@"facedetectionpic.jpg"]];
    // Draw the face detection image
    [self.window addSubview:image];
    // Execute the method used to markFaces in background
    [self markFaces:image];
}

-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView* image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"facedetectionpic.jpg"]];
    // Draw the face detection image
    [self.window addSubview:image];
    // Execute the method used to markFaces in background
    [self performSelectorInBackground:@selector(markFaces:) withObject:image];
    // flip image on y-axis to match coordinate system used by core image
    [image setTransform:CGAffineTransformMakeScale(1, -1)];
    // flip the entire window to make everything right side up
    [self.window setTransform:CGAffineTransformMakeScale(1, -1)];
}
Also check these two links:
http://maniacdev.com/2011/11/tutorial-easy-face-detection-with-core-image-in-ios-5/
http://i.ndigo.com.br/2012/01/ios-facial-recognition/

Putting two backgrounds one after the other in objective-c iphone game?

Hi, I am creating an iPhone game.
In my game layer, I have two identical background images, but I want them to go one after the other.
Once the game animal (e.g. a penguin) crosses the first background, I want the first background to move behind the second one, becoming a continuous background until the game is over.
I have tried everything (for loops, while loops and such) but nothing seems to work.
Does anyone have an idea for how I could go about doing this?
Thank you for any help you can provide me.
This is all I have so far after many different tries:
- (id) init
{
    if ((self = [super init]))
    {
        for (int x = 0; x < 10000000; ++x)
        {
            CCSprite *bg = [CCSprite spriteWithFile:@"level1.png"];
            [bg setPosition:ccp(160, 240)];
            ccTexParams params = {GL_LINEAR, GL_LINEAR, GL_REPEAT, GL_REPEAT};
            [bg.texture setTexParameters:&params];
            [self addChild:bg z:0];

            CCSprite *bg2 = [CCSprite spriteWithFile:@"level1.png"];
            [bg2 setPosition:ccp(320, 480)];
            ccTexParams param = {GL_LINEAR, GL_LINEAR, GL_REPEAT, GL_REPEAT};
            [bg2.texture setTexParameters:&param];
            [self addChild:bg2 z:0];
        }
    }
    return self;
}
If the backgrounds are identical you don't actually need two images; you can just shift the texture rect of the one sprite and it will scroll continuously. If you call moveBackground on a timer or in the update method it will scroll continually (see the scheduling sketch after the code below).
-(id)init
{
    if ((self = [super init]))
    {
        background = [CCSprite spriteWithFile:@"background.png"];
        // GL_REPEAT lets the texture rect wrap around; note that repeat
        // wrapping only works if the texture dimensions are powers of two.
        ccTexParams params = {GL_LINEAR, GL_LINEAR, GL_REPEAT, GL_REPEAT};
        [background.texture setTexParameters:&params];
        [self addChild:background z:0];
    }
    return self;
}

-(void)moveBackground
{
    // Scroll background 5 pixels to the left by shifting the texture rect
    int speed = 5;
    background.textureRect = CGRectMake(background.textureRect.origin.x - speed,
                                        background.textureRect.origin.y,
                                        background.textureRect.size.width,
                                        background.textureRect.size.height);
}

Why is drawInContext so slow here?

I'm using the QuartzImage class from one of the demo projects, and what I was trying to achieve is a simple frame display unit that draws an image (320x480) every 1/10th of a second. So my "frame rate" should be 10 frames per second.
In the QuartzImage demo class there is a drawInContext method, and in this method it basically draws a CGImageRef using CGContextDrawImage(). I measured the time it takes to complete, and it's taking around 200 ms on average.
2011-03-24 11:12:33.350 QuartzDemo[3159:207] drawInContext took 0.19105 secs
-(void)drawInContext:(CGContextRef)context
{
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
    CGRect imageRect;
    imageRect.origin = CGPointMake(0.0, 0.0);
    imageRect.size = CGSizeMake(320.0f, 480.0f);
    CGContextDrawImage(context, imageRect, image);
    CFAbsoluteTime end = CFAbsoluteTimeGetCurrent();
    NSLog(@"drawInContext took %2.5f secs", end - start);
}
Can anyone explain why it's taking that long, and whether there is any other way of improving the performance? 200 ms just seems much longer than it should take.
UPDATES
I tried @Brad Larson's suggestion but I'm not seeing much performance improvement.
So in the updated version I have my own class:
@interface FDisplay : UIView {
    CALayer *imgFrame;
    NSInteger frameNum;
}
@end
In my class implementation:
- (id)initWithFrame:(CGRect)frame {
    ............
    frameNum = 0;
    NSString *file = [NSString stringWithFormat:@"frame%d", frameNum];
    UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:file ofType:@"jpg"]];
    imgFrame = [CALayer layer];
    CGFloat nativeWidth = CGImageGetWidth(img.CGImage);
    CGFloat nativeHeight = CGImageGetHeight(img.CGImage);
    CGRect startFrame = CGRectMake(0.0, 0.0, nativeWidth, nativeHeight);
    imgFrame.contents = (id)img.CGImage;
    imgFrame.frame = startFrame;
    CALayer *l = [self layer];
    [l addSublayer:imgFrame];
    return self;
}
I have an NSTimer firing every 0.1 s that calls my refresh method:
-(void)refresh
{
    CFAbsoluteTime start = CFAbsoluteTimeGetCurrent();
    NSString *file = [NSString stringWithFormat:@"frame%d", frameNum];
    UIImage *img = [UIImage imageWithContentsOfFile:[[NSBundle mainBundle] pathForResource:file ofType:@"jpg"]];
    frameNum++;
    if (frameNum > 100)
        frameNum = 0;
    [CATransaction begin];
    [CATransaction setValue:(id)kCFBooleanTrue forKey:kCATransactionDisableActions];
    [imgFrame setContents:(id)img.CGImage];
    [CATransaction commit];
    CFAbsoluteTime end = CFAbsoluteTimeGetCurrent();
    NSLog(@"(%d)refresh took %2.5f secs", [[self subviews] count], end - start);
}
I think I got everything right, but the frame rate is still way too low:
refresh took 0.15960 secs
Using Quartz to draw out images is about the slowest way you could do this, due to the way that content is presented to the screen.
As a step up, you could use a UIImageView and swap out its hosting image for each frame. An even faster approach might be to use a CALayer and set its contents property to a CGImageRef representing each image frame. You may need to disable implicit animations using a CATransaction for this to be as fast as it can be. If I recall, I was able to get 320x480 images to be drawn at over 15 FPS on an original iPhone using the latter method.
Finally, for optimal performance you could set up an OpenGL ES display with a rectangle that filled the screen and supply the images as textures to be rendered on that rectangle. This would require significantly more code, but it would be extremely fast.
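To make the CALayer suggestion above concrete, here is a minimal sketch of a self-contained frame player (my own illustration, not code from the question or answer; it assumes frames named frame0.jpg through frame99.jpg in the bundle and uses CADisplayLink rather than the NSTimer from the question):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// Minimal frame player: a CALayer whose contents are swapped on a CADisplayLink.
@interface FramePlayerView : UIView {
    CALayer *frameLayer;
    NSInteger frameNum;
}
@end

@implementation FramePlayerView

- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        frameLayer = [CALayer layer];
        frameLayer.frame = self.bounds;
        [self.layer addSublayer:frameLayer];

        // A 60 Hz display divided by a frame interval of 6 gives roughly 10 FPS.
        CADisplayLink *link = [CADisplayLink displayLinkWithTarget:self
                                                          selector:@selector(tick:)];
        link.frameInterval = 6;
        [link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
    }
    return self;
}

- (void)tick:(CADisplayLink *)link {
    // Note: decoding a JPEG from disk on every tick can itself limit the frame rate.
    NSString *file = [NSString stringWithFormat:@"frame%d", frameNum];
    UIImage *img = [UIImage imageWithContentsOfFile:
                    [[NSBundle mainBundle] pathForResource:file ofType:@"jpg"]];
    frameNum = (frameNum + 1) % 100;

    // Disable the implicit fade animation so the contents swap is immediate.
    [CATransaction begin];
    [CATransaction setDisableActions:YES];
    frameLayer.contents = (id)img.CGImage;
    [CATransaction commit];
}

@end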
I have been porting my game from Android to iPhone and I was shocked by the performance of Quartz.
The same code on Android is 10x faster than on iPhone. I have a few DrawImages, and a bunch of line draws and beziers.
It's very slow on iOS, especially on the iPhone 4, where the processor struggles to keep up with the Retina display resolution.
My game performed perfectly at 60fps in almost ANY android device.
I tried several approaches for rendering (always avoiding OpenGL). I started by drawing everything at every frame. Then I started rendering as much as I could before the game loop, by using UIImage's. Now I'm trying to go with CALayers. Although I can see the game at steady 60FPS on iPhone 3GS and 4S, the most I can get on the iPhone 4 is 45 FPS.