There is SKShapeNode, but it can't be used in conjunction with masks.
Is there an equivalent to UIView's -drawRect: or CALayer's -renderInContext: in Sprite Kit? I looked at SKSpriteNode and don't see a way to perform custom drawing with Core Graphics functions.
How would you do it?
Nope. There's no custom drawing in Sprite Kit along the lines of custom OpenGL or Core Graphics.
You can create textures using Core Graphics and filter textures with CIFilter, though, in case that helps.
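For instance, a minimal sketch of the texture route (the size and the drawing itself are placeholders):

```objc
// Draw into a Core Graphics image context, then wrap the result in an SKTexture.
CGSize size = CGSizeMake(64, 64);
UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetFillColorWithColor(ctx, [UIColor redColor].CGColor);
CGContextFillEllipseInRect(ctx, CGRectMake(0, 0, size.width, size.height));
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

SKTexture *texture = [SKTexture textureWithImage:image];
SKSpriteNode *node = [SKSpriteNode spriteNodeWithTexture:texture];
```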
Related
I am implementing a paint program in SpriteKit, and I am super impressed by the simplicity of implementation SpriteKit enables. Very cool indeed!
Each brush stroke is implemented as an SKSpriteNode with an attached SKShader. Everything works as I expect.
Issue: Currently I am steadily growing the SKScene graph as brush strokes are appended. This is not a scalable solution, as resource consumption mushrooms and the framerate slows to a crawl. So, what I want to do is "bake" the SKScene graph after each brush stroke is painted.
I tried setting myScene.shouldRasterize = YES on my SKScene instance, but it appeared to have no effect. How do I "bake" rendering results in SpriteKit? Do I have to roll my own?
Thanks,
Doug
The only way I can think of to do this is to call textureFromNode on your SKView, passing in the SKNode that you want to "bake". Then take that texture, apply it to an SKSpriteNode, and remove the SKNode(s) that you just "baked" into the texture.
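A sketch of that approach (strokeLayer, skView, and scene are placeholder names for your stroke container, view, and scene):

```objc
// Render the stroke subtree into a single texture...
SKTexture *baked = [skView textureFromNode:strokeLayer];
SKSpriteNode *bakedSprite = [SKSpriteNode spriteNodeWithTexture:baked];
bakedSprite.position = strokeLayer.position;

// ...then swap the expensive subtree for the one flattened sprite.
[strokeLayer removeFromParent];
[scene addChild:bakedSprite];
```

Depending on your anchor points, you may need to adjust the baked sprite's position so it lines up with where the original nodes were drawn.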
I wanted to add a comment, but my reputation won't allow me. I'd just like to add that you can rasterize a group of sprites by using SKEffectNode. Just add any sprite to this node, then use the same shouldRasterize property. You can even apply Core Image filters to it, such as a Gaussian blur.
The downside is obviously the performance cost when creating the rasterized node.
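A rough sketch of the SKEffectNode approach, assuming a scene and a child sprite (brushStrokeSprite) already exist:

```objc
// Group sprites under an SKEffectNode; shouldRasterize caches the rendered
// result, so the children are drawn once and reused until the subtree changes.
SKEffectNode *group = [SKEffectNode node];
group.shouldRasterize = YES;
[group addChild:brushStrokeSprite];
[scene addChild:group];

// Optionally run the whole group through a Core Image filter:
CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
[blur setValue:@5.0 forKey:kCIInputRadiusKey];
group.filter = blur;
```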
How to use particle effects in a view-based application?
I have created a game using a view-based application, and I want to use particle effects in my game. But I don't have any idea about using particles in a view-based application.
Please give me some ideas.
Generally... you can't. Use a fullscreen GL view for a regular particle effect.
A particle effect requires drawing a massive number of sprites with alpha blending. You can do this with GL; however, a GL view cannot be overlaid on other UIViews.
A normal UIView is implemented with a backing CALayer. This is a kind of GL drawing, but it is optimized for smooth animation of a low-density UI, not for a massive drawing count. So its performance is too low and unacceptable for a particle effect.
I tested a CALayer-based particle system, and about 128 particles was the maximum at a meaningful frame rate on a 3GS.
How big is the difference between the drawing model of Quartz 2D and that of OpenGL ES?
They seem similar in descriptive power... except that Quartz is mostly 2D and that OpenGL is 3D out of the box (but can be made 2D-focused).
Are the mappings from 2D Quartz to 2D OpenGL ES that different? I'm sure there must be differences in some specific features that might be handled differently on one vs. the other... but enough to rule out writing a translator?
Does anyone with experience in both OpenGL and Quartz 2D have some insights?
Quartz and OpenGL ES are two completely different animals. While they both have a C-based API that deals with a state machine and that draws into a context, their purposes are dissimilar. In Quartz you specify lines, Bezier and quadratic curves, arcs, or rectangles, as well as fills, gradients, and shadows / glows. In OpenGL ES, you provide vertices, raster textures, and lighting information, from which a scene is generated.
They are both useful in particular cases. You might draw a 2-D static element using Quartz, into a view, layer, or texture, and then place and move that view or layer in 3-D space using Core Animation or do the same for a texture using OpenGL ES.
Rather than try to overlay one API on the other, use whichever is more appropriate for what you are doing, or look to a framework like cocos2d which lets you build and animate 2-D scenes or Core Animation where you can do Quartz drawing into a layer but still use a nicely abstracted API for moving these layers around.
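To illustrate that division of labor (a sketch; myView is a placeholder name): draw the static 2-D element with Quartz, then move the rendered result in 3-D with Core Animation.

```objc
// Quartz: draw a filled circle inside a UIView subclass's -drawRect:.
- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextFillEllipseInRect(ctx, CGRectMake(10, 10, 80, 80));
}

// Core Animation: rotate the view's rendered layer in 3-D space.
myView.layer.transform = CATransform3DMakeRotation(M_PI_4, 0, 1, 0);
```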
I would like to draw 2d shapes like this in an iPhone app:
(Image: http://www.shaggyfrog.com/junk/beveled-circle.jpg)
I asked a similar question here to see if I could do it easily with Quartz, but didn't get a solution. So I got to thinking that I might be able to leverage an existing 2D library out there, and then I thought of cocos2d.
The goal is to draw these kinds of beveled shapes dynamically, i.e., using arbitrary colours, and possibly with the highlight/bevel drawn at an arbitrary position.
Is this possible with cocos2d?
As far as my knowledge of cocos2d goes, it will not let you do this in any way that OpenGL itself wouldn't. Cocos2d uses OpenGL under the hood, and it comes with no built-in support for creating such graphics.
Since the bevel is used to create a 3D effect, perhaps you shouldn't be looking at simulating it with 2D drawing, but should instead use a 3D drawing library. OpenGL would certainly be capable of drawing such shapes. Cocos2d focuses on 2D drawing instead of 3D.
I'm not sure if Cocos2D would allow for a custom object to draw 3D using the underlying OpenGL mechanism. I have never tried.
Wouldn't it instead be easier to create the image in Photoshop and adjust the colors dynamically? I'm not sure what you are trying to do.
You could also create a mask shape with a transparent "bevel effect" and scale it along with the image you want to appear shiny.
Aside from the bevel effect, if you want to "colorize" each semicircle, you can use [sprite setColor:] or sprite.color = ccc3(r,g,b):
// Grab one frame of the artwork from the sprite sheet (the rect picks the sub-image)
CCSprite *sprite = [CCSprite spriteWithSpriteSheet:sheet rect:CGRectMake(32 * idx, 0, 128, 32)];
// Tint the white artwork with a random color
[sprite setColor:ccc3(CCRANDOM_0_1() * 255, CCRANDOM_0_1() * 255, CCRANDOM_0_1() * 255)];
You would design a "white semicircle" with beveled (gray) edges. Then you can make sprites and color each one separately.
I was reading that OpenGL ES can work together with Core Animation, so I wonder if I can reuse some of my hard-won Core Animation knowledge when I start doing OpenGL ES work...
You're pretty much on your own when you decide to use OpenGL. The only thing you're going to use your EAGLView for is to get the touch events.
Now, the Quartz stuff is still useful: you can draw 2D graphics at runtime and display them in your OpenGL code as textures. But Core Animation isn't useful at all.
EDIT: To expand on that slightly: you can use Core Animation and the UIView methods to do all the normal things to an EAGLView, but you shouldn't; the EAGLView should not have any UIViews overlapping it, either. Both will kill performance.
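A sketch of the Quartz-into-a-texture route mentioned above (sizes are placeholders; error handling omitted):

```objc
// Render with Quartz into a bitmap buffer...
size_t w = 128, h = 128;
void *pixels = calloc(w * h * 4, 1);
CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGContextRef ctx = CGBitmapContextCreate(pixels, w, h, 8, w * 4, space,
                                         kCGImageAlphaPremultipliedLast);
CGColorSpaceRelease(space);
CGContextSetFillColorWithColor(ctx, [UIColor greenColor].CGColor);
CGContextFillEllipseInRect(ctx, CGRectMake(0, 0, w, h));

// ...then upload the pixel data as an OpenGL ES texture.
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)w, (GLsizei)h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, pixels);

CGContextRelease(ctx);
free(pixels);
```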