Cocos2d low image quality - iOS 5

I am developing a game for iOS. I created a sprite sheet with TexturePacker, which produces two files: a .plist and a .png. When I use these files in my code, all of the sprites look low quality and the colors are very pale, and I think the TexturePacker step is the cause.
Please advise: what should I do to overcome this problem?

I guess you didn't change the back-buffer pixel format. You can try this: use the pixel format kEAGLColorFormatRGBA8 instead of kEAGLColorFormatRGB565.
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    // Create the main window
    window_ = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

    // Create a CCGLView with an RGBA8 color buffer and no depth buffer
    CCGLView *glView = [CCGLView viewWithFrame:[window_ bounds]
                                   pixelFormat:kEAGLColorFormatRGBA8   // was kEAGLColorFormatRGB565
                                   depthFormat:0                       // GL_DEPTH_COMPONENT24_OES
                            preserveBackbuffer:NO
                                    sharegroup:nil
                                 multiSampling:NO
                               numberOfSamples:0];
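If the back buffer is already RGBA8 and the sprites still look washed out, the pale colors can also come from the texture side: check that TexturePacker's image/pixel format is set to RGBA8888 rather than a reduced format such as RGBA4444, and that cocos2d is not loading textures in a reduced default format. A small sketch (the constant is kCCTexture2DPixelFormat_RGBA8888 in cocos2d 2.x, kTexture2DPixelFormat_RGBA8888 in older versions; "sheet.plist" is a placeholder for your atlas):
// Force 32-bit RGBA textures before loading the sprite sheet.
[CCTexture2D setDefaultAlphaPixelFormat:kCCTexture2DPixelFormat_RGBA8888];
[[CCSpriteFrameCache sharedSpriteFrameCache] addSpriteFramesWithFile:@"sheet.plist"];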

Related

Cocos2d - CCSprite image texture gets blurred

I am using the latest cocos2d to create a game project and have added both Retina and normal versions of my images. But whenever I add a CCSprite with its initializer, initWithFile:(NSString *)fileName, the image texture gets blurred at runtime, on both Retina and non-Retina devices.
Is it for all textures or only for some? If it is for all textures, then change the pixelFormat.
In AppController, set the back-buffer pixelFormat to kEAGLColorFormatRGBA8:
CCGLView *glView = [CCGLView viewWithFrame:[window_ bounds]
                               pixelFormat:kEAGLColorFormatRGBA8
                               depthFormat:0   // GL_DEPTH_COMPONENT24_OES
                        preserveBackbuffer:NO
                                sharegroup:nil
                             multiSampling:NO
                           numberOfSamples:0];
Please try this on your sprite's texture:
[myccsprite.texture setAliasTexParameters];
It tells the texture not to anti-alias (it switches from linear to nearest-neighbour filtering).
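Note that with a TexturePacker sheet all frames share one texture, so the alias setting applies to every sprite cut from that sheet. A sketch of setting it on the shared atlas texture explicitly ("sheet.png" is a placeholder name):
// Grab the shared atlas texture and switch its filtering.
CCTexture2D *sheetTexture = [[CCTextureCache sharedTextureCache] addImage:@"sheet.png"];
[sheetTexture setAliasTexParameters];        // nearest-neighbour, crisp pixels
// [sheetTexture setAntiAliasTexParameters]; // back to linear filtering (the default)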
In AppController, I set the back-buffer pixelFormat to kEAGLColorFormatRGBA8. This really worked:
CCGLView *glView = [CCGLView viewWithFrame:[window_ bounds]
                               pixelFormat:kEAGLColorFormatRGBA8
                               depthFormat:0   // GL_DEPTH_COMPONENT24_OES
                        preserveBackbuffer:NO
                                sharegroup:nil
                             multiSampling:NO
                           numberOfSamples:0];

How to develop a Face recognition iPhone app? [closed]

I am trying to develop an iPhone app for face recognition/detection. In my app I want the iPhone camera to auto-focus and auto-capture.
How do I recognize a face from an iPhone app?
Is it possible to auto-focus on a face and auto-capture in an iPhone app? If so, can anyone please help me do this? I just want suggestions, ideas, and tutorials about it.
Can you please help me? Thanks in advance.
Core Image has a new CIDetector (with CIDetectorTypeFace) that detects faces in real time; you can start with these examples to get an overview:
SquareCam
iOS Facial Recognition
Easy Face detection with Core Image
Check this code. You have to import the following:
#import <CoreImage/CoreImage.h>
#import <QuartzCore/QuartzCore.h>  // for the CALayer properties used below
and after that use the code:
-(void)markFaces:(UIImageView *)facePicture
{
    // draw a CI image with the previously loaded face detection picture
    CIImage *image = [CIImage imageWithCGImage:facePicture.image.CGImage];

    // create a face detector - since speed is not an issue we'll use a high accuracy detector
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                              context:nil
                                              options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh
                                                                                  forKey:CIDetectorAccuracy]];

    // create an array containing all the detected faces from the detector
    NSArray *features = [detector featuresInImage:image];

    // we'll iterate through every detected face. CIFaceFeature provides us
    // with the width of the entire face, and the coordinates of each eye
    // and the mouth if detected. Also provided are BOOLs for the eyes and
    // mouth so we can check whether they exist.
    for (CIFaceFeature *faceFeature in features)
    {
        // get the width of the face
        CGFloat faceWidth = faceFeature.bounds.size.width;

        // create a UIView using the bounds of the face
        UIView *faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];

        // add a border around the newly created UIView
        faceView.layer.borderWidth = 1;
        faceView.layer.borderColor = [[UIColor redColor] CGColor];

        // add the new view to create a box around the face
        [self.window addSubview:faceView];

        if (faceFeature.hasLeftEyePosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *leftEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.leftEyePosition.x - faceWidth * 0.15,
                                                                           faceFeature.leftEyePosition.y - faceWidth * 0.15,
                                                                           faceWidth * 0.3,
                                                                           faceWidth * 0.3)];
            // change the background color of the eye view
            [leftEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            // set the position of the leftEyeView based on the face
            [leftEyeView setCenter:faceFeature.leftEyePosition];
            // round the corners
            leftEyeView.layer.cornerRadius = faceWidth * 0.15;
            // add the view to the window
            [self.window addSubview:leftEyeView];
        }
        if (faceFeature.hasRightEyePosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *rightEyeView = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.rightEyePosition.x - faceWidth * 0.15,
                                                                            faceFeature.rightEyePosition.y - faceWidth * 0.15,
                                                                            faceWidth * 0.3,
                                                                            faceWidth * 0.3)];
            // change the background color of the eye view
            [rightEyeView setBackgroundColor:[[UIColor blueColor] colorWithAlphaComponent:0.3]];
            // set the position of the rightEyeView based on the face
            [rightEyeView setCenter:faceFeature.rightEyePosition];
            // round the corners
            rightEyeView.layer.cornerRadius = faceWidth * 0.15;
            // add the new view to the window
            [self.window addSubview:rightEyeView];
        }
        if (faceFeature.hasMouthPosition)
        {
            // create a UIView with a size based on the width of the face
            UIView *mouth = [[UIView alloc] initWithFrame:CGRectMake(faceFeature.mouthPosition.x - faceWidth * 0.2,
                                                                     faceFeature.mouthPosition.y - faceWidth * 0.2,
                                                                     faceWidth * 0.4,
                                                                     faceWidth * 0.4)];
            // change the background color for the mouth to green
            [mouth setBackgroundColor:[[UIColor greenColor] colorWithAlphaComponent:0.3]];
            // set the position of the mouthView based on the face
            [mouth setCenter:faceFeature.mouthPosition];
            // round the corners
            mouth.layer.cornerRadius = faceWidth * 0.2;
            // add the new view to the window
            [self.window addSubview:mouth];
        }
    }
}
-(void)faceDetector
{
    // Load the picture for face detection
    UIImageView *image = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"facedetectionpic.jpg"]];

    // Draw the face detection image
    [self.window addSubview:image];

    // Execute the markFaces: method in the background
    [self performSelectorInBackground:@selector(markFaces:) withObject:image];

    // flip image on the y-axis to match the coordinate system used by Core Image
    [image setTransform:CGAffineTransformMakeScale(1, -1)];

    // flip the entire window to make everything right side up
    [self.window setTransform:CGAffineTransformMakeScale(1, -1)];
}
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    // Override point for customization after application launch.
    self.viewController = [[ViewController alloc] initWithNibName:@"ViewController" bundle:nil];
    self.window.rootViewController = self.viewController;
    [self.window makeKeyAndVisible];
    [self faceDetector]; // execute the faceDetector code
    return YES;
}
Hope it helps thanks :)
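One caveat about the code above: markFaces: is started with performSelectorInBackground:, yet it also creates and adds UIViews, and UIKit should only be touched from the main thread. A hedged sketch of one way to split the work, assuming the same detector, image, and window from the method body:
// Run the (slow) CIDetector work off the main thread...
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSArray *features = [detector featuresInImage:image];
    // ...then hop back to the main queue before creating or adding views.
    dispatch_async(dispatch_get_main_queue(), ^{
        for (CIFaceFeature *faceFeature in features) {
            UIView *faceView = [[UIView alloc] initWithFrame:faceFeature.bounds];
            faceView.layer.borderWidth = 1;
            faceView.layer.borderColor = [[UIColor redColor] CGColor];
            [self.window addSubview:faceView];
        }
    });
});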

CCShaky3D turns the background black

I'm trying to make my sprite have a shake effect. However, while the sprite does shake, the entire background turns black. Can anybody help me with this?
Here's the code that I've written to add the sprite to my layer along with the action that I run right after.
CCSprite *picture = [CCSprite spriteWithFile:@"picture.png"];
picture.position = ccp(winsize.width / 4,
                       picture.contentSize.height * 0.8);
[self addChild:picture];

CCShaky3D *shake = [CCShaky3D actionWithRange:4
                                       shakeZ:NO
                                         grid:ccg(12, 12)
                                     duration:0.5];
[picture runAction:shake];
Can anybody help me?
Have you enabled depth buffering on the EAGLView? Most 3D actions require a depth buffer (GL_DEPTH_COMPONENT16_OES or GL_DEPTH_COMPONENT24_OES) to avoid visual artifacts. You may also have to use a 32-bit frame buffer with an alpha channel by using kEAGLColorFormatRGBA8 instead of kEAGLColorFormatRGB565.
EAGLView is initialized in the app delegate class:
EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                               pixelFormat:kEAGLColorFormatRGBA8
                               depthFormat:GL_DEPTH_COMPONENT24_OES
                        preserveBackbuffer:NO
                                sharegroup:nil
                             multiSampling:NO
                           numberOfSamples:0];
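A side note on grid actions in general (not part of the original answer): the grid can stay applied after CCShaky3D finishes, leaving the sprite slightly distorted, so it is common to sequence a CCStopGrid action after it. A sketch reusing the action from the question:
id shake = [CCShaky3D actionWithRange:4 shakeZ:NO grid:ccg(12, 12) duration:0.5];
[picture runAction:[CCSequence actions:shake, [CCStopGrid action], nil]];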

Changing z-order of draw method in Cocos2d?

Right now I am trying to draw a polygon in Cocos2d, but I need it drawn over a background. How do I change the draw call's z-order and give it top priority, so that I can see the line instead of it being covered by the background?
Here is the draw method:
I decided to just make it a line instead of a polygon and adjust its width, but otherwise it is the same...
-(void) draw {
    glColor4f(1.0f, 0.0f, 0.0f, 1.0f);
    glLineWidth(5.0f);
    ccDrawLine(healthBar[0], healthBar[1]);
}
Use the reorderChild method of CCNode and put your rectangle over the background:
-(void) reorderChild:(CCNode *)child z:(int)zOrder;
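For example (hypothetical node names; this assumes the background and the node doing the custom drawing are both children of the same layer):
// Higher z draws later, i.e. on top of lower z.
[self addChild:backgroundSprite z:0];
[self addChild:healthBarNode z:10];
// Or change the order after the fact:
[self reorderChild:healthBarNode z:10];
If the draw code currently lives in the layer itself, moving it into a small CCNode subclass gives it its own z value to reorder.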
In your app delegate, you need to modify your GLView so that it has depth enabled; then you should be able to modify the depth of your draw call. Look for something like this:
EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                               pixelFormat:kEAGLColorFormatRGBA8   // kEAGLColorFormatRGBA8
                               depthFormat:0                       // GL_DEPTH_COMPONENT16_OES
                    ];
Change the 0 to either GL_DEPTH_COMPONENT16_OES or GL_DEPTH_COMPONENT24_OES.
Edit:
I'm not familiar with OpenGL, but perhaps this page will help you.

Is it possible to get a 16-bit depth buffer on the iPhone?

We're making a game using cocos2d but are having problems with the depth buffer.
I'm trying to setup a 16-bit depth buffer on the iPhone but so far I only get 24-bit depth.
The reasons I want a 16-bit depth buffer are:
A: I don't need the precision of 24-bit.
B: I'm hoping it will be faster.
This is how I set the depth format in cocos2d:
[[CCDirector sharedDirector] setDepthBufferFormat:kDepthBuffer16];
Which at some point ends up in EAGLView (v1.3):
glGenRenderbuffersOES(1, &_depthBuffer);
glBindRenderbufferOES(GL_RENDERBUFFER_OES, _depthBuffer);
glRenderbufferStorageOES(GL_RENDERBUFFER_OES, _depthFormat, newSize.width, newSize.height);
glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES, GL_RENDERBUFFER_OES, _depthBuffer);
Where _depthFormat is GL_DEPTH_COMPONENT16_OES (verified while debugging)
And here's how I check the number of bits:
GLint depthBufferBits;
glGetIntegerv( GL_DEPTH_BITS, &depthBufferBits );
NSLog(@"Depth buffer bits: %d", depthBufferBits);
And the output is:
Depth buffer bits: 24
What am I missing? I've tried the same code on an iPod touch (2nd gen) and an iPhone 3GS; it always comes back as 24-bit.
Update:
I've now updated cocos2d to the latest version from the Git repo.
Here's how I initialize the director and window:
- (void)applicationDidFinishLaunching:(UIApplication *)application
{
    // Init the window
    window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    [[UIApplication sharedApplication] setIdleTimerDisabled:YES];

    // cocos2d will inherit these values
    [window setUserInteractionEnabled:YES];
    [window setMultipleTouchEnabled:YES];

    [CCDirector setDirectorType:CCDirectorTypeDisplayLink];
    CCDirector *director = [CCDirector sharedDirector];
    [director setAnimationInterval:1.0/60];
    [director setDisplayFPS:YES];

    EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                                   pixelFormat:kEAGLColorFormatRGBA8      // RGBA8 color buffer
                                   depthFormat:GL_DEPTH_COMPONENT16_OES   // 16-bit depth buffer
                            preserveBackbuffer:NO];
    [director setOpenGLView:glView];
    [director setProjection:kCCDirectorProjection2D];

    [window addSubview:glView];
    [window makeKeyAndVisible];

    [CCTexture2D setDefaultAlphaPixelFormat:kTexture2DPixelFormat_RGBA8888];

    [director runWithScene:[LevelScene scene]];
}
I'm still getting a 24-bit depth buffer according to glGetIntegerv(GL_DEPTH_BITS, &depthBufferBits);
I'm not sure about the internals, but setDepthBufferFormat is deprecated. I can also imagine that it is simply no longer possible to change the depth buffer after the GL view has been created. With Cocos2D v0.99.5 beta 3 the initialization has changed quite a bit, so I suggest you update to the latest version, even though it's a beta.
Then take a look here: http://www.cocos2d-iphone.org/wiki/doku.php/prog_guide:setup_buffers
The important part is the initialization of the EAGLView, where you can set the depth buffer:
EAGLView *glView = [EAGLView viewWithFrame:[window bounds]
                               pixelFormat:kEAGLColorFormatRGBA8
                               depthFormat:GL_DEPTH_COMPONENT16_OES
                        preserveBackbuffer:NO];