Problem animating a Parallax Layer in Cocos2d-iPhone - iphone

I'm trying to implement my own parallax scrolling in Cocos2d, since the built-in parallax functionality isn't exactly what I'm looking for (it scales the layers, which is not what I want).
So my own Parallax Layer overrides the Layer's setPosition method like this:
- (void) setPosition:(CGPoint)point
{
[super setPosition:CGPointMake(point.x / pRatio, point.y / pRatio)];
}
pRatio is an int that reduces how far the layer moves, creating the kind of parallax effect I'm looking for. And it works perfectly like this.
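For illustration, a minimal sketch of how two such layers might be driven (ParallaxLayer as a class name and pRatio exposed as a property are assumptions for the example; the setPosition override is the one shown above):
// Sketch: both layers receive the same scroll offset; the setPosition
// override above divides it, so the layer with the larger pRatio moves less.
ParallaxLayer *nearLayer = [ParallaxLayer node];   // assumed class name
nearLayer.pRatio = 1;                              // moves 1:1 with the offset
ParallaxLayer *farLayer = [ParallaxLayer node];
farLayer.pRatio = 3;                               // moves at a third of the offset
CGPoint scrollOffset = ccp(-120, 0);
[nearLayer setPosition:scrollOffset];              // lands at (-120, 0)
[farLayer setPosition:scrollOffset];               // lands at (-40, 0)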
However, on occasion I need to animate a Layer to a specific position using MoveTo, so I created this method, which responds to an NSNotification:
- (void) gotoStartPosition:(NSNotification *)notification
{
CGPoint startPos = CGPointFromString([notification object]);
id moveToStart = [MoveTo actionWithDuration:1 position:startPos];
[self runAction:moveToStart];
}
This works until I set pRatio to anything above 1. If pRatio is higher than 1, gotoStartPosition makes the layer jump to CGPoint(0,0) and animate into position from there.
I can't figure out at what point the Layer's position is being set to (0,0), and I don't understand why it only does this when pRatio is higher than 1.
I suspect this is going to be something painfully simple, but I think my brain is fried. Any help, greatly appreciated.

UPDATE: I'm a doofus. What's happening is obvious. The animation first sets the layer's position to its current position, but I've told the layer to divide any setPosition coordinates by pRatio. So it's not actually going to CGPoint(0,0) but to something like CGPoint(0.032, 0). I've fixed it like this:
- (void) setPosition:(CGPoint)point
{
if(!animating){
[super setPosition:CGPointMake(round(point.x / pRatio), round(point.y / pRatio))];
}else{
[super setPosition:point];
}
}
- (void) gotoStartPosition:(NSNotification *)notification
{
animating = YES;
CGPoint startPos = CGPointFromString([notification object]);
CGPoint pStart = CGPointMake(startPos.x / pRatio, startPos.y / pRatio);
id moveToStart = [MoveTo actionWithDuration:1 position:pStart];
id endAnim = [CallFunc actionWithTarget:self selector:@selector(endAnimation)];
[self runAction:[Sequence actions: moveToStart, endAnim, nil]];
}
- (void) endAnimation
{
animating = NO;
}
Which isn't elegant, but it works. I'm closing this question.

Related

Adding a Hud Layer to Scene Cocos2d-3

To keep it simple: what is the easiest way to make the default [ Menu ] in the default HelloWorld scene (for example) its own layer? The issue I'm having now is that the scene is completely black, with nothing showing up!
GameLayer node:
- (id)init
{
// Enable touch handling on scene node
self.userInteractionEnabled = YES;
self.theMap = [CCTiledMap tiledMapWithFile:@"AftermathRpg.tmx"];
self.contentSize = theMap.contentSize;
self.metaLayer = [theMap layerNamed:@"Meta"];
metaLayer.visible = NO;
CCTiledMapObjectGroup *objects = [theMap objectGroupNamed:@"mainChar"];
NSMutableDictionary *startPoint = [objects objectNamed:@"startPosition"];
int x = [[startPoint valueForKey:@"x"] intValue];
int y = [[startPoint valueForKey:@"y"] intValue];
self.mainChar = [CCSprite spriteWithImageNamed:@"mainChar.png"];
mainChar.position = ccp(x,y);
[self addChild:mainChar];
[self addChild:theMap z:-1];
[self setCenterOfScreen: mainChar.position];
return self;
}
HudLayer node
-(id)init
{
CGSize winSize = [[CCDirector sharedDirector] viewSize];
CCButton *backButton = [CCButton buttonWithTitle:@"[ Menu ]" fontName:@"Verdana-Bold" fontSize:18.0f];
backButton.position = ccp(0.85f * winSize.width, 0.95f * winSize.height);
[backButton setTarget:self selector:@selector(onBackClicked:)];
[self addChild:backButton];
return self;
}
Scene
+ (GameScene *)scene
{
return [[self alloc] init];
}
- (id)init
{
// Apple recommends assigning self to super's return value
self = [super init];
if (!self) return(nil);
CGSize winSize = [CCDirector sharedDirector].viewSize;
self.gameLayer = [GameLayer node];
[self addChild:gameLayer z:-1];
//self.contentSize = self.gameLayer.contentSize;
hudLayer = [HudLayer node];
hudLayer.position = ccp(winSize.width * 0.9, winSize.height * 0.9);
[self addChild:hudLayer z:1];
return self;
}
From the OP I take it that you have two issues: the HUD is not static (i.e. it moves as your map moves, which you don't want), and it is not positioned at the top of the screen.
Looking at the position issue first: your button's position is set to normalized. Since the scene's content size has been made the size of your map, which I take to be larger than your screen, that is why the button shows up at the top right of the map rather than the screen. To fix this, don't use normalized positioning. If you still want to express the position in the 0-to-1 range, use the following (and also remove the line that sets the position type to normalized):
CGSize winSize = [[CCDirector sharedDirector] viewSize];
backButton.position = ccp(0.85f * winSize.width, 0.95f * winSize.height);
If your map is 10,000 x 10,000 then using the normalized positioning like you are will set the button to (8,500, 9,500) rather than the top of the screen.
Looking at the static issue next: from the looks of it, you have a Hello World scene that you are adding everything to, right? It also looks like you are moving that scene with a call to:
[self setCenterOfScreen: player.position];
What you want to do instead is this: you first have a scene:
HelloWorldScene* scene;
And to this scene you add two "main" layers: one is the root gameplay layer (with your gameplay sub-layers as its children) and the other is your HUD layer. For example:
GameplayLayer* gameLayer;
HudLayer* hudLayer;
[scene addChild:gameLayer];
[scene addChild:hudLayer];
When the player moves (or the camera, or whatever), what should move is the game layer, not the root Hello World scene. Moving the root scene moves all of its children, which includes the HUD, and that is not what you want.
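A minimal sketch of that, reusing the names from the question (centerGameLayerOn: is a hypothetical helper playing the role of setCenterOfScreen:, with any clamping to the map edges left out):
// Sketch: centre the game layer on the player without touching the HUD.
- (void)centerGameLayerOn:(CGPoint)playerPosition
{
    CGSize winSize = [CCDirector sharedDirector].viewSize;
    CGPoint centerOfView = ccp(winSize.width / 2, winSize.height / 2);
    CGPoint offset = ccp(centerOfView.x - playerPosition.x,
                         centerOfView.y - playerPosition.y);
    gameLayer.position = offset;   // only the gameplay layer moves
    // hudLayer.position is never touched, so the HUD stays fixed on screen
}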
When I worked on, for example, the Goldfish Mysteries app (https://itunes.apple.com/us/app/finn-friends-mysteries/id740040227?mt=8) I had essentially layers for:
Story (comprised of multiple sub-layers like bg, characters, etc)
Text (highlighted text that plays along with the narration audio)
HUD (comprised of multiple sub-layers)
Whenever there is movement on the story level, it occurs on the story layer. When the HUD appears, the text and story layers are paused recursively (i.e. they and all of their children), along with the narration audio if any is playing, but the HUD layer remains untouched. Resuming consists of resuming the story, the text, and any playing narration. I don't remember offhand whether dropping the HUD moved the story and text layers down in that specific app, since I don't have my iPad in front of me, but I have done apps in the past where dropping the HUD shifted all the other layers. In that situation, and for a simple app, it would be fine to move the scene, since the scene would only be shifted enough to show the TOC (in this type of app, for example). What you look to be doing is moving the entire scene with the player's movement, which is not what you really wanted to do by the looks of it.
Either way you want a clear separation between layers and operations that are meant to only happen on specific layers should be directed only towards those layers.
Hope this helped.
UPDATE (Edited):
Based on your edited OP, you have introduced a new problem. For the HUD you shouldn't have to set the layer's position, since inside the HUD layer everything is already laid out relative to the screen. So what you should have instead is:
hudLayer = [HudLayer node];
[self addChild:hudLayer z:1];
The second issue is that you are not properly writing your init methods for the game and HUD layer classes. The init should look like:
- (instancetype)init
{
self = [super init];
if (self)
{
// Do your init stuff...
}
return self;
}
You are never calling:
self = [super init];
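Applied to the GameLayer init from the question, that would look roughly like this (only the structure changes; the body is the setup you already have, abbreviated here). The HudLayer init needs the same treatment.
- (instancetype)init
{
    self = [super init];
    if (self)
    {
        // Enable touch handling on scene node
        self.userInteractionEnabled = YES;
        self.theMap = [CCTiledMap tiledMapWithFile:@"AftermathRpg.tmx"];
        self.contentSize = theMap.contentSize;
        // ... the rest of the setup from the question ...
    }
    return self;
}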

Rotate UIView while dragging

Using a UIPanGestureRecognizer, I'm allowing the user to drag a UIView. What I now want to accomplish is getting the view to adjust its rotation around a specified point DURING dragging.
Examples of this can be found in the Tinder App (when dragging a portrait the images rotate slightly) or in the Path App (when dragging a friends popup up or down it rotates to the side grabbed on).
I must admit that I have very little experience working with UIPanGestureRecognizer, but I assume that it won't affect the answer.
Nevertheless, one solution could be to create a timer that runs while the UIView is being dragged; the timer action would then update the rotation of the UIView properly and as often as possible.
Another solution could be to subclass UIView and override the drag handling (I do not know what it is called) and simply have it adjust its own rotation whenever a pivot point is given.
Do the rotation in the following method
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
self.yourView.transform = CGAffineTransformMakeRotation(angle); // angle in radians, computed from the touch position
}
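If you prefer the UIPanGestureRecognizer mentioned in the question, here is a sketch of the same idea (cardView, the 0.003 factor, and the spring-back at the end are illustrative assumptions, and the recognizer is assumed to be attached to cardView):
// Sketch: rotate proportionally to how far the view has drifted horizontally,
// similar to the Tinder card effect.
- (void)handlePan:(UIPanGestureRecognizer *)pan
{
    CGPoint translation = [pan translationInView:self.view];
    // Move the view with the finger.
    self.cardView.center = CGPointMake(self.cardView.center.x + translation.x,
                                       self.cardView.center.y + translation.y);
    [pan setTranslation:CGPointZero inView:self.view];
    // Rotate based on the offset from the middle of the screen.
    CGFloat xOffset = self.cardView.center.x - CGRectGetMidX(self.view.bounds);
    CGFloat angle = xOffset * 0.003;               // arbitrary scaling factor
    self.cardView.transform = CGAffineTransformMakeRotation(angle);
    if (pan.state == UIGestureRecognizerStateEnded)
    {
        [UIView animateWithDuration:0.3 animations:^{
            self.cardView.center = self.view.center;
            self.cardView.transform = CGAffineTransformIdentity;
        }];
    }
}
CGAffineTransformMakeRotation rotates about the view's anchor point (the centre by default); to pivot about another point, adjust the view's layer.anchorPoint first, keeping in mind that changing anchorPoint also shifts where the layer is drawn.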
Use CATransform3D
Add
#define DEGREES_TO_RADIANS(d) (d * M_PI / 180)
in .pch file
CATransform3D myTransform = CATransform3DIdentity;
myTransform.m34 = 1.0 / -500;
myTransform = CATransform3DRotate(myTransform, DEGREES_TO_RADIANS(90), 0.0f, 0.0f, 1.0f);
myView.layer.transform = myTransform;
You can go on changing the angle here: DEGREES_TO_RADIANS(90). Hope this helps.

iOS: CGContextRef drawRect does not agree with input

Background: I would like to draw blocks where the user touches up. If a block is already there, I want to erase it. I manage the blocks by using an NSMutableArray to keep track of the points where blocks should go. Every time the user touches, it determines whether the touched place already contains a block and manages the array accordingly.
Problem: I get very weird feedback from this. First of all, everything in the array works as I want. The problem comes when the user tries to erase a block. While the array is maintained correctly, the drawing seems to ignore the change in the array: it will not remove anything but the last dot, and even that toggles on and off when the user clicks elsewhere.
Here is the code :
- (void)drawRect:(CGRect)rect
{
NSLog(@"drawrect current array %@", pointArray);
for (NSValue *pointValue in pointArray){
CGPoint point = [pointValue CGPointValue];
[self drawSquareAt:point];
}
}
- (void) drawSquareAt:(CGPoint) point{
float x = point.x * scale;
float y = point.y * scale;
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextMoveToPoint(context, x, y);
CGContextAddLineToPoint(context, x+scale, y);
CGContextAddLineToPoint(context, x+scale, y+scale);
CGContextAddLineToPoint(context, x, y+scale);
CGContextAddLineToPoint(context, x, y);
CGContextSetFillColorWithColor(context, [UIColor darkGrayColor].CGColor);
CGContextFillPath(context);
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *aTouch = [touches anyObject];
CGPoint point = [aTouch locationInView:self];
point = CGPointMake( (int) (point.x/scale), (int) (point.y/scale));
NSLog(@"Touched at %@", [NSArray arrayWithObject:[NSValue valueWithCGPoint:point]]);
NSValue *pointValue = [NSValue valueWithCGPoint:point];
int i = [pointArray indexOfObject:pointValue];
NSLog(@"Index at %i", i);
if (i < [pointArray count]){
[pointArray removeObjectAtIndex:i];
NSLog(@"remove");
}else {
[pointArray addObject:pointValue];
NSLog(@"add");
}
NSLog(@"Current array : %@", pointArray);
[self setNeedsDisplay];
}
scale is defined as 16.
pointArray is a member variable of the view.
To Test : You can drop this into any UIView and add that to the viewController to see the effect.
Question : How do I get the drawing to agree with the array?
Update + Explanation: I am aware of the cost of this approach, but it is only created for me to get a quick figure. It will not be used in the real application, so please do not get hung up on how expensive it is. I only created this capability to get a figure I draw as an NSString (@"1,3,5,1,2,6,2,5,5,..."). This will become more efficient when I am actually using it, with no redrawing. Please stick to the question asked. Thank you.
I don't see anywhere where you are actually clearing what you drew previously. Unless you explicitly clear (such as by filling with UIRectFill() - which, as an aside, is a more convenient way to draw rectangles than filling an explicit path), Quartz is going to just draw over your old content, which will cause unexpected behavior on attempts at erasure.
So... what happens if you put the following at the beginning of -drawRect:?
[[UIColor whiteColor] setFill]; // Or whatever your background color is
UIRectFill([self bounds]);
(This is of course horrendously inefficient, but per your comment, I am disregarding that fact.)
(As a separate aside, you probably should wrap your drawing code in a CGContextSaveGState()/CGContextRestoreGState() pair to avoid tainting the graphics context of any calling code.)
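A sketch of that pattern applied to drawSquareAt: (using the UIRectFill() convenience mentioned above; scale is the same constant as in the question):
- (void)drawSquareAt:(CGPoint)point
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSaveGState(context);        // snapshot the current drawing state
    [[UIColor darkGrayColor] setFill];
    UIRectFill(CGRectMake(point.x * scale, point.y * scale, scale, scale));
    CGContextRestoreGState(context);     // restore fill colour etc. for any caller
}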
EDIT: I always forget about this property since I usually want to draw more complex backgrounds anyway, but you can likely achieve similar results by setting clearsContextBeforeDrawing to YES on the UIView.
This approach seems a little weird to me because every time the touchesEnded method is called you need to redraw (which is an expensive operation) and also keep track of the squares. I suggest you subclass UIView and implement the drawRect: method so the view knows how to draw itself, and implement the touchesEnded method in your view controller: there you can check whether you touched a square view and remove it from the view controller's view, or otherwise create a square view and add it as a subview of the view controller's view (see the sketch below).
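A rough sketch of that subview-based approach (squareViews is a hypothetical NSMutableArray property on the view controller, and scale is the same constant as in the question):
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint point = [[touches anyObject] locationInView:self.view];
    CGRect cell = CGRectMake(floorf(point.x / scale) * scale,
                             floorf(point.y / scale) * scale,
                             scale, scale);
    // If a square already occupies this cell, remove it; otherwise add one.
    UIView *existing = nil;
    for (UIView *square in self.squareViews) {
        if (CGRectEqualToRect(square.frame, cell)) { existing = square; break; }
    }
    if (existing) {
        [existing removeFromSuperview];
        [self.squareViews removeObject:existing];
    } else {
        UIView *square = [[UIView alloc] initWithFrame:cell];
        square.backgroundColor = [UIColor darkGrayColor];
        [self.view addSubview:square];
        [self.squareViews addObject:square];
    }
}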

Circle to circle collision reaction - Equal priority

I am creating an app where there will be ~10 circles on screen that can all be pushed and dragged around the screen via touch movements. These circles (which are simply UIViews with a circle drawn in them) also have collision detection on them, and the idea is that they push other circles out of the way when they collide. There is no acceleration or gravity in this app; it's more like pushing coins around, where they are simply pushed out of the way, and when you stop moving, so do they.
I have touch-to-circle interaction working, and the fundamentals of circle-to-circle working. The code, however, gives Circle01 all the power. I.e., it can be pushed into other circles and they will move out of the way as hoped, but when the other circles are pushed into it, they move around Circle01 instead of pushing it out of the way.
I've read dozens of pages about collision detection, but I'm just stumped as to how I'd modify this code to give all the circles equal power.
Relevant code:
MyCircle.m
- (void)drawRect:(CGRect)rect
{
CGPoint viewCenter = CGPointMake(50,50);
bpath = [UIBezierPath bezierPathWithArcCenter:viewCenter radius:50 startAngle:0 endAngle:DEGREES_TO_RADIANS(360) clockwise:YES];
[[UIColor blueColor] setFill];
[bpath fill];
}
CircleVC.m
- (void)viewDidLoad
{
Circle01 = [[MyCircle alloc] initWithFrame:CGRectMake(200, 200, 100, 100)];
Circle01.backgroundColor = [UIColor redColor];
Circle01.alpha = 0.5;
[self.view addSubview:Circle01];
}
-(void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
// Testing circle collision detection
for (UIView *aView in self.view.subviews)
{
if (aView == Circle01)
continue;
if (CGRectIntersectsRect(Circle01.frame, aView.frame))
{
float xDistance = aView.center.x - Circle01.center.x;
float yDistance = aView.center.y - Circle01.center.y;
float distance = xDistance * xDistance + yDistance * yDistance;
float angle = atan2(yDistance, xDistance);
CGPoint newCenter;
if (distance < 100 * 100)
{
newCenter.x = Circle01.center.x + (Circle01.frame.size.width * cos(angle));
newCenter.y = Circle01.center.y + (Circle01.frame.size.width * sin(angle));
aView.center = newCenter;
}
}
}
}
Thanks for any help.
It may be easier to offload this to a physics engine such as Box2d. Cocos2d is packaged with Box2d, but you will want the standalone version since you are only using UIView-based animations.
_
Option 1
Here is a good, though slightly dated tutorial on using Box2d with UIKit.
http://www.cocoanetics.com/2010/05/physics-101-uikit-app-with-box2d-for-gravity/
Some caveats: if you are using ARC you will need to use the god-awful bridging compiler hints that tell the ARC pre-compiler when your C objects are being put into and out of its control.
For example:
bodyDef.userData = (__bridge void *)physicalView;
UIView *oneView = (__bridge UIView*)b -> GetUserData();
_
Option 2
You can use Chipmunk (which would also require the above "god-awful" bridging to ARC statements). Here is a good tutorial: http://www.alexandre-gomes.com/articles/chipmunk/
_
Option 3 - Easiest (Not Cheapest)
Alternatively, you could use the Objective-Chipmunk library that comes with Chipmunk Pro which does work with ARC. Here is a recent and "to-the-point" tutorial on using Obj-Chipmunk with UIImageViews. This would be the same for any UIView based code. http://www.ayarsanimation.com/frankayars/chipmunk
Objective-Chipmunk costs $89 when you buy an Indie license for Chipmunk Pro:
http://chipmunk-physics.net/chipmunkPro.php
I am not advocating a paid solution, but it may save you time.
It seems pretty simple, unless I'm missing something. Instead of hard-coding Circle01 in your touchesMoved, do your hit detection based on the view that the user touched. That way, whatever view they are dragging will get priority.
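A sketch of that change, keeping the structure of the touchesMoved: from the question but replacing the hard-coded Circle01 with whichever circle the finger is currently on (the 100-point size is the circle frame width from the question):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint touchPoint = [[touches anyObject] locationInView:self.view];
    // Whichever circle the finger is over becomes the "pusher".
    MyCircle *draggedCircle = nil;
    for (UIView *aView in self.view.subviews) {
        if ([aView isKindOfClass:[MyCircle class]] &&
            CGRectContainsPoint(aView.frame, touchPoint)) {
            draggedCircle = (MyCircle *)aView;
            break;
        }
    }
    if (!draggedCircle) return;
    draggedCircle.center = touchPoint;
    // Same collision response as before, but relative to the dragged circle.
    for (UIView *aView in self.view.subviews) {
        if (aView == draggedCircle || ![aView isKindOfClass:[MyCircle class]])
            continue;
        float xDistance = aView.center.x - draggedCircle.center.x;
        float yDistance = aView.center.y - draggedCircle.center.y;
        if (xDistance * xDistance + yDistance * yDistance < 100 * 100) {
            float angle = atan2f(yDistance, xDistance);
            aView.center = CGPointMake(draggedCircle.center.x + 100 * cosf(angle),
                                       draggedCircle.center.y + 100 * sinf(angle));
        }
    }
}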

Cocos2d - how to make individual particles follow the layer, not the emitter?

I have a CCSprite and a CCParticleSystemQuad that are both children of the CCLayer. In my update method, I set the emitter's position to that of the sprite, so it tracks the sprite around. The smoke puffs fall out of the bottom of the sprite like you'd expect, and even when you move the sprite around, the smoke appears to be part of the background layer.
The problem comes if I match up their rotations. Now, for example, if my sprite is rocking back and forth, the puffs of smoke swing in an arc and appear attached to the sprite.
How can I make the puffs of smoke continue along the parent layer in a straight line and not rotate with the sprite? They don't translate with the sprite when I move it, so why do they rotate with it?
EDIT: adding code...
- (id)init
{
if (!(self = [super init])) return nil;
self.isTouchEnabled = YES;
CGSize screenSize = [[CCDirector sharedDirector] winSize];
sprite = [CCSprite spriteWithFile:@"Icon.png"]; // declared in the header
[sprite setPosition:ccp(screenSize.width/2, screenSize.height/2)];
[self addChild:sprite];
id repeatAction = [CCRepeatForever actionWithAction:
[CCSequence actions:
[CCRotateTo actionWithDuration:0.3f angle:-45.0f],
[CCRotateTo actionWithDuration:0.6f angle:45.0f],
[CCRotateTo actionWithDuration:0.3f angle:0.0f],
nil]];
[sprite runAction:repeatAction];
emitter = [[CCParticleSystemQuad particleWithFile:@"jetpack_smoke.plist"] retain]; // declared in the header - the particle was made in Particle Designer
[emitter setPosition:sprite.position];
[emitter setPositionType:kCCPositionTypeFree]; // ...Free and ...Relative seem to behave the same.
[emitter stopSystem];
[self addChild:emitter];
[self scheduleUpdate];
return self;
}
- (void)update:(ccTime)dt
{
[emitter setPosition:ccp(sprite.position.x, sprite.position.y-sprite.contentSize.height/2)];
[emitter setRotation:[sprite rotation]]; // if you comment this out, it works as expected.
}
// there are touches methods to just move the sprite to where the touch is, and to start the emitter when touches began and to stop it when touches end.
I found the answer on a different site - www.raywenderlich.com
I don't know why this is true, but it seems that CCParticleSystems don't like to be rotated while you move them around. They don't mind changing their angle property. Actually, there may be cases where you want that behavior.
Anyway, I made a method that adjusts the emitter's angle property and it works fine. It takes your touch location and scales the x component into the angle.
- (void)updateAngle:(CGPoint)location
{
float width = [[CCDirector sharedDirector] winSize].width;
float angle = location.x / width * 360.0f;
CCLOG(@"angle = %1.1f", angle);
[smoke_emitter setAngle:angle]; // I added both smoke and fire!
[fire_emitter setAngle:angle];
// [emitter setRotation:angle]; // this doesn't work
}
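For completeness, a sketch of how the touch handler described in the question might feed updateAngle: (assuming the standard cocos2d-iPhone layer touch handling already enabled in the init above):
- (void)ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:[touch view]]];
    sprite.position = location;      // move the sprite to the touch, as described above
    [self updateAngle:location];     // steer the particles instead of rotating the emitter
}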
CCSprite's anchorPoint is {0.5f, 0.5f}, while the emitter descends directly from CCNode, which has an anchorPoint of {0.0f, 0.0f}. Try setting the emitter's anchorPoint to match the CCSprite's.
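For example (a one-line sketch of that suggestion):
[emitter setAnchorPoint:ccp(0.5f, 0.5f)]; // match CCSprite's default anchor point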