I am learning cocos2d for iPhone and came across Munch Time (see also the company website page), notably game of the week this week :). I am an indie developer, so I won't be able to achieve the same great quality, but I would like to understand how they managed the light effect of the moon and stars that can be seen in this video.
What do you think? I am planning to stick with cocos2d 1.x, as it is compatible with a wider range of iPhones than cocos2d 2.x, unless using the latter is the only way to get those effects (or unless you reckon that achieving the quality of Munch Time is not possible with cocos2d at all).
Thanks for reading.
You can use the stable release of cocos2d, which is 1.0.1 as of this post.
The lighting effects you are referring to are most likely baked into the images. The moon itself is solid, but the 'light' around it has an alpha fade: it starts mostly white near the moon's edge and becomes more transparent the farther it gets from the edge, which gives the glow effect.
The clouds also look somewhat transparent, which lets the moonlight seep through slightly.
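To illustrate the alpha fade described above, here is a minimal sketch of a per-pixel glow alpha, assuming a simple linear falloff from the moon's edge out to an outer glow radius (the radii and the linear ramp are illustrative assumptions; an artist would typically paint this gradient directly into the PNG):

```python
# Sketch: alpha for a moon glow, opaque inside the moon and fading
# linearly to fully transparent at the outer glow radius.

def glow_alpha(distance_from_center, moon_radius, glow_radius):
    """Return an alpha value in [0, 1] for a pixel at the given
    distance from the moon's center."""
    if distance_from_center <= moon_radius:
        return 1.0  # solid moon disc
    if distance_from_center >= glow_radius:
        return 0.0  # outside the glow entirely
    # Linear fade across the glow band
    return 1.0 - (distance_from_center - moon_radius) / (glow_radius - moon_radius)

# Halfway through the glow band gives half opacity
print(glow_alpha(75, 50, 100))  # 0.5
```

A smoother (e.g. quadratic) ramp would soften the glow further; the point is only that the effect is image data, not runtime lighting.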
However, all the animations, effects, etc., should be reproducible in cocos2d.
Facebook released a demo video of their Surround 360 technology a few days ago, called "Here and Now": https://www.facebook.com/facebook/videos/10154659446236729/
Apparently they are using their proposed cube-mapping projection for this. Can someone familiar with it verify that?
When I'm wearing my Gear VR and rotate my head, I notice a slight quality change. Also, does anybody know whether they are using adaptive, view-aware streaming such as DASH for this (which would be impressive)? I am assuming the video is not downloaded first and then played, so maybe the quality change is not just due to rendering.
Facebook uses pyramid encoding. They put a sphere inside a pyramid so that the base of the pyramid is the full-resolution FOV and the sides of the pyramid gradually decrease in quality until they reach a point directly opposite from the viewport, behind the viewer. That explains why, when you turned your head with the GearVR on, you noticed a quality change. They don't use MPEG-DASH, yet.
https://code.facebook.com/posts/1126354007399553/next-generation-video-encoding-techniques-for-360-video-and-vr/
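The falloff described in the answer can be sketched as a quality weight that is highest looking at the pyramid's base and lowest directly behind the viewer. The actual encoding is more involved; the floor value and the linear ramp below are purely illustrative assumptions:

```python
# Sketch: illustrative quality weight as a function of the angle
# between the view direction and the encoded viewport direction.

def quality_weight(angle_from_viewport_deg):
    """Return a relative quality in [floor, 1.0]: full quality at the
    pyramid base (0 degrees), lowest directly behind the viewer (180)."""
    floor = 0.25  # assumed minimum quality behind the viewer
    t = angle_from_viewport_deg / 180.0  # 0 = straight ahead, 1 = behind
    return floor + (1.0 - floor) * (1.0 - t)

print(quality_weight(0))    # 1.0 (full resolution ahead)
print(quality_weight(180))  # 0.25 (lowest, behind the viewer)
```

This is why the head turn is noticeable: the player switches to the stream whose pyramid base faces your new viewport, and for a moment you see the low-weight region of the old one.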
What I'm trying to do, in an augmented reality app with Vuforia, is darken the whole scene to give a 'night effect' (the real camera texture has to be darkened too). I've been trying this for a couple of days now.
I searched for it and only found help with night-vision effects, which gave me an idea of how to approach the problem. I've been trying it with a shader attached to the camera.
How can I do a night effect (just darken the whole scene, though it would be better if I could darken only the desired objects) with a shader?
Any other ideas on how to do this would be great too.
An obvious approach would be to use a dark-colored, thick fog, and turn the lights down, of course. Beyond that, your question is a bit vague, but try the fog.
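As an illustration of how dark fog darkens a scene, the standard exponential fog blend can be sketched in plain code. In practice this runs per fragment in a shader; the density and colors here are assumed values you would tune:

```python
import math

# Sketch: exponential fog blend, the same math a fog shader applies
# per fragment. A near-black fog color darkens everything with distance.

def apply_fog(scene_rgb, fog_rgb, distance, density=0.5):
    """Blend a fragment toward the fog color: the farther the fragment,
    the more the fog color dominates."""
    f = math.exp(-density * distance)  # 1.0 at the camera, -> 0 far away
    return tuple(f * s + (1.0 - f) * g for s, g in zip(scene_rgb, fog_rgb))

# Near fragments keep their color; distant ones go almost black
print(apply_fog((1.0, 0.8, 0.6), (0.02, 0.02, 0.05), distance=0.0))
print(apply_fog((1.0, 0.8, 0.6), (0.02, 0.02, 0.05), distance=10.0))
```

For the camera background (the real camera texture), a simpler uniform multiply of the texture color by a darkness factor achieves the same "obscure everything" goal without any distance term.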
I'm trying to do a little research for my next game that I'm planning to make and just wondering if anyone could give me a little direction.
I'd like to use sprites to show animation, characters walking and such, so my question is this: which game engine handles sprites the best?
And how many sprites can be shown per second? Say I had a character walking and wanted it to look pretty fluid: might I be able to get 60 fps, or is that way too high?
Last question, sorry! If a sprite has more colors and complexity but is the same file size as something simpler, would it take more processing power to display the complex one?
thanks!
James.
I would highly recommend cocos2d for sprite animations. It's very easy to pick up if you already know Objective-C, and it's great for working with sprites. The animations are very fluid, and when you're testing your application in the iOS simulator, it shows the frames per second in the bottom left-hand corner; it usually runs at about 60. Regarding the sprite file size, I believe that if the file size is the same between two sprites, they require about the same amount of processing power.
A good engine to use, for its simplicity, is the Sparrow engine, for sprites, sound and other things. There is no reason you can't get 60 fps. As for your last question, it wouldn't make a difference.
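One clarification on the 60 fps question: the render loop and the sprite animation don't have to run at the same rate. A walk cycle usually has far fewer drawings than 60 per second; the engine just picks which frame to show from the elapsed time. A rough sketch, with the frame count and animation rate assumed:

```python
# Sketch: choosing which frame of a looping walk cycle to display.
# The game can render at 60 fps while the animation itself only
# advances 12 times per second.

def current_frame(elapsed_seconds, frame_count=8, frames_per_second=12):
    """Return the index of the walk-cycle frame to draw right now."""
    return int(elapsed_seconds * frames_per_second) % frame_count

print(current_frame(0.0))   # 0
print(current_frame(0.5))   # 6 (halfway through a 12 fps, 8-frame loop)
```

So "fluid at 60 fps" is mostly about smooth movement across the screen, not about authoring 60 distinct drawings per second of animation.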
I want to create an iPhone/iPod game. I want it to be 3d, but I want to use sprites instead of OpenGL to do this (I assume this would be easier since I don't know OpenGL).
I was thinking of simply layering sprites over top of each other and changing their size to give an illusion of 3d. It doesn't need to be too convincing since the game will be somewhat cartoony.
How can I use sprites as an alternative to OpenGL on the iPhone platform?
You can use CoreAnimation for this. Either using UIImageViews (or plain UIViews) or CALayers.
It's usually the best choice for some types of 2d games (board games, for example), since animation, rotation and scaling are really easy. Just keep in mind that if performance is your concern, OpenGL will always be better.
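The scale-with-depth illusion the question describes can be sketched with simple perspective division; the focal length here is an assumed constant you would tune by eye, and you would apply the resulting factor to a UIImageView's or CALayer's transform:

```python
# Sketch: fake-3d sprite scaling by perspective division.
# Sprites farther away (larger z) are drawn smaller.

def sprite_scale(z_depth, focal_length=300.0):
    """Return the scale factor for a sprite at the given depth."""
    return focal_length / (focal_length + z_depth)

print(sprite_scale(0))    # 1.0 (at the 'screen plane')
print(sprite_scale(300))  # 0.5 (at the focal distance, half size)
```

Drawing the sprites back-to-front (largest z first) completes the illusion, since nearer sprites then cover farther ones.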
Depending on how much 3d you need, I'd recommend taking a look at cocos2d. It supports multiple layers, sprites, animations, etc., but is pretty straightforward to pick up and learn (much easier than OpenGL, to me). The example code is very comprehensive.
http://code.google.com/p/cocos2d-iphone/
I have built a game using Core Animation with up to about 17-20 objects floating about the screen, scaling and rotating, and performance was fine on the iPhone (make sure you test regularly on the device, as the simulator doesn't simulate the iPhone's memory or CPU speed).
Core Animation is pretty simple and really powerful. Use PNGs for images and I don't think you will have too many issues. The real killer here is alpha in your images; transparency is hard work for the iPhone, so the less of it you have, the better your app will run.
In addition to Marco's answer, I want to add: not using OpenGL may also tax the device's battery a little more. As I understand it, OpenGL ES can be easier on a device's power supply (if implemented properly). Of course, this depends on how much animation is going to be used with UIImageView, UIView, CALayers, etc.
I'm sure there is a tipping point.
OK, I'm still brand new to iPhone development. I have a free game on the App Store, Winner Pong, but it's just a Pong clone (who would've guessed) that uses standard UIImageViews for the sprites. Now I want to do something a little more complicated and port my Xbox 360 game, Trippin Alien, to the iPhone. I obviously can't keep using UIImageViews, so I was wondering which would be better to learn: the simpler but performance-hindering Quartz2D, or the smooth-running but dauntingly complex OpenGL ES.
My game is basically a copter game, with about 8-10 sprites on screen plus a simple particle system (video here). Not too complicated, but performance does matter. My only prior game programming experience is with Microsoft's XNA and C#, which has a built-in SpriteBatch framework that makes it incredibly easy to draw, scale, and rotate pre-rendered sprites on screen. Is it worth it to learn OpenGL ES? How big is the performance gap? Is Quartz really that simple?
Also, if anyone knows of any tutorials for either one, please, post them here. I need as much help as I can get.
Look through code samples of each to actually see the complexity. You might find that OpenGL isn't so daunting.
Regarding performance: Core Animation, which Quartz2D is part of, uses OpenGL under the hood, so for simple sprite animations I would expect your game to perform fairly well.
I would also glance over the programming guide for each before making your final decision.
Another alternative is to use something like Unity. I recently started playing around with the trial version of this development environment, and if you're mostly doing game development with graphical objects and sprites, it may be one option to consider. You can script in C#, JavaScript, or Boo. The development environment lets you set up your scenes and levels graphically; you can then attach scripts to graphical objects to handle animation, user events, etc.
One downside to Unity, which I've heard from others, is that if you want to use the familiar UI controls from UIKit, it's not so easy to instantiate them. I haven't verified this myself.