OpenGL ES: Is it more efficient to turn lights off and on, or just assign emissivity to my particles?

I'm working on Android, and I'm new to graphics.
I have some particles that I don't want to be affected by lighting. Right now, I disable lighting right before I draw the particles (gl.glDisable(GL10.GL_LIGHTING)), re-enable it once they're drawn, and then continue drawing the rest of the scene.
Would it be more efficient to leave lighting on and just set all the particles to be fully emissive, by calling glMaterialfv(GL_FRONT_AND_BACK, GL_EMISSION, white) right before drawing the particles?

There's no easy answer here, I'm afraid. As the anon commenter says, you're best off measuring it yourself.
Bear in mind that the results will likely be very hardware-dependent. Disabling lighting is less work for the hardware, but the state change may disrupt the processing pipeline such that the saving is negated.
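If you do profile it, measure whole frames on the target device: GL calls are queued asynchronously, so a CPU timer around the particle draw alone can be misleading. A minimal harness sketch (written here in C#, though the same pattern applies in your Android Java code; the two draw callbacks are hypothetical placeholders for your real render paths):

```csharp
using System;
using System.Diagnostics;

static class LightingBenchmark
{
    // Hypothetical stand-ins for the two particle-drawing paths being compared.
    static void DrawFrameWithLightingToggle()
    {
        // ... draw scene, glDisable(GL_LIGHTING), draw particles, glEnable(GL_LIGHTING) ...
    }

    static void DrawFrameWithEmissiveMaterial()
    {
        // ... draw scene, glMaterialfv(GL_FRONT_AND_BACK, GL_EMISSION, white), draw particles ...
    }

    // Average milliseconds per frame over a fixed number of frames.
    static double TimeFrames(Action drawFrame, int frames)
    {
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < frames; i++)
            drawFrame();
        sw.Stop();
        return sw.Elapsed.TotalMilliseconds / frames;
    }

    static void Main()
    {
        // Warm-up runs so driver and JIT effects don't skew the comparison.
        TimeFrames(DrawFrameWithLightingToggle, 100);
        TimeFrames(DrawFrameWithEmissiveMaterial, 100);

        Console.WriteLine($"lighting toggle:   {TimeFrames(DrawFrameWithLightingToggle, 1000):F3} ms/frame");
        Console.WriteLine($"emissive material: {TimeFrames(DrawFrameWithEmissiveMaterial, 1000):F3} ms/frame");
    }
}
```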

Related

Unity - how to provide diffuse lighting

I have a simple scene of the interior of a house (minus the roof). It doesn't need to look realistic in any way, just geometrically correct, so the walls, furnishings and fittings are simply constructed from primitive objects: cubes, cylinders, etc.
The layout is fine, the problem is the lighting - very black shadows. The scene has the standard single directional light source.
What I need to do is provide overall diffuse lighting - equivalent to an overcast day.
I should point out that I am pretty much a novice on all this - lighting, shaders etc, though I have been reading a lot.
From what I read it appears that this is controlled by shaders, shaders being attached to materials, materials being applied to the objects. However, it doesn't seem to make much sense to me. Surely, a shader, if part of the object by virtue of being attached to the material, can only deal with how light might be reflected off the surface - but the light has to get there first.
Therefore, there must be a way of providing an overall diffuse light in the first place?
Or have I got this completely wrong? How does one get rid of the blackness on the non-illuminated side of an object? So far the only way I have found is to make the surface emit light, i.e. glow a bit, which surely can't be right.
Your general understanding of how this all works is correct. One way to look at it: an object requests rendering, the renderer looks up its material, and the material binds a shader to a set of parameters. The shader then gets executed once per light in the scene that affects the object (this is simplifying things, but we'll get to that in a bit). This is why lights are expensive (in forward rendering, that is): until optimizations start to kick in, it means rendering the scene n times.
So yes, you could just add a constant factor in the shader to achieve the effect of 'ambient' or 'diffuse' lighting. But for that shader to also support features like reflectivity, it would have to be crazy complicated.
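That said, you don't have to write the ambient term yourself: Unity exposes it project-wide as the scene's ambient light, which the built-in shaders already read. A minimal sketch, assuming the built-in render pipeline (the colour value is just an example):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class OvercastAmbient : MonoBehaviour
{
    void Start()
    {
        // Flat ambient: every surface receives the same base light,
        // so nothing falls to pure black on its unlit side.
        RenderSettings.ambientMode = AmbientMode.Flat;
        RenderSettings.ambientLight = new Color(0.6f, 0.6f, 0.65f); // soft, slightly cool grey
    }
}
```

The same values can also be set in the editor's Lighting settings window, with no code at all.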
Fortunately, with Unity we also get a middle layer called the Standard Shader, which does pretty much all of the math underneath and frees you from having to write shader code.
For a gentle, diffused look, you definitely want to look at Unity's baked indirect illumination features; you could even light everything with area lights only.
It's probably also a good idea to look into light probe groups. They work with spherical harmonics, encoding only the low-frequency components of the lighting data, effectively capturing just the slow-changing factors like the general direction of the light.
Finally, look into reflection probes (and skyboxes while you're at it). There are a few good free HDR probes available that will emit light into your scene (when baking lightmaps and light probes), enabling surprising realism compared to the default Unity skybox.
If you don't want a harsh directional light, just disable it (although it's often useful to know what your strongest light source is in your scene; even if it's a skybox with some clouds, I would probably keep a scene light just to spot problems faster if anything goes wrong).

How can I use baked lighting on sprites? / How to light up a large area in 2D?

I'm having trouble figuring out how to light up large area(s) of sprites in Unity 2D. My previous knowledge on Unity's lighting is zero.
I first tried using a large number of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this, but I couldn't get it to work, either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume that the reader already knows something about lighting in Unity, which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials that go into how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" in Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing the wrong direction, and I do realise that Unity2D is just like painting onto a piece of paper in Unity3D. I was able to get point lights to work, but only a few at a time. I don't need to do the entire screen at once, I need to do a large specific area at once.
Some tips...
Working with sprites you're in 2D. When you add a light, switch to 3D mode and rotate to make sure your light is pointed at your objects, and oriented so it isn't on the same plane as (or level with) them, as that will cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector you can change its variables: intensity, range, width, height, etc.
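If you prefer to set that up from a script, here's a minimal sketch. Note that Unity's area lights only contribute to baked lightmaps, so for a light that follows the camera at runtime, a point light is the usual substitute (all values here are examples to tune in the Inspector):

```csharp
using UnityEngine;

public class CameraLight : MonoBehaviour
{
    void Start()
    {
        // Create a light and parent it to the camera so it follows the view.
        var go = new GameObject("CameraLight");
        go.transform.SetParent(Camera.main.transform, false); // stay at the camera's position
        go.transform.localRotation = Quaternion.identity;     // point where the camera points

        var light = go.AddComponent<Light>();
        light.type = LightType.Point; // area lights are baked-only, so use a point light at runtime
        light.intensity = 1.5f;       // example value
        light.range = 30f;            // example value
    }
}
```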
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.

Unity3D: Why is particle lighting making directional shadows disappear?

UPDATE: As @BenHayward suspected, this is a bug. <link>
I have a very simple setup of cubes on a plane comprising a grid of quads. A directional light is shining down at the scene at an angle, producing a set of shadows from the cubes onto the quads.
Now I'm trying to produce an explosion effect with Unity's particle system, but when I add a point light to the particle system it causes all the directional-lighting shadows to disappear, whether they're in line of sight of the particle or not.
The shadows reappear when the particle is destroyed. Replicating the particle effect with pure C# doesn't cause any problems.
(Oh, and obviously I'm using the deferred rendering path.)
Any ideas? This is driving me up the wall.
[EDIT: I should have mentioned that the point light added to the particle system is set to cast shadows. The Unity standard particle pack has shadow-casting disabled by default. They too cause the problem when I turn the shadow-casting on.]
Based on the project that you linked to, it seems as though the particle system is causing the shadow cast from the directional light to flicker on and off quickly. I suspect this is a bug, since if it were intended behaviour, I wouldn't expect it to flicker in this manner.
In cases where this is not a bug, the problem can be caused by a couple of issues:
You can only have a certain number of dynamic (shadow-casting) lights visible within the camera frustum. By default, this number is quite low (I think it's 4). You can increase it by going to Edit > Project Settings > Quality and raising the Pixel Light Count from its default value. You will need to set it greater than the total number of lights in your effect. Higher values allow more lights to be rendered on screen, but this reduces performance.
It also depends on the shaders you are using to receive the shadows. Some shaders will only render shadows for one directional light, and which light gets used isn't always easy to determine. If you are using the standard Unity shader this shouldn't be a problem, but if you are using a mobile-compatible surface shader, or something you've written yourself, this could be the cause of the problem.
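Regarding the first issue, the Pixel Light Count setting can also be raised from code, which is handy if you only need the headroom while an effect is playing. A minimal sketch (the value 8 is arbitrary):

```csharp
using UnityEngine;

public class RaisePixelLights : MonoBehaviour
{
    void Awake()
    {
        // Equivalent to Edit > Project Settings > Quality > Pixel Light Count.
        // Must exceed the number of pixel lights you expect on screen at once.
        QualitySettings.pixelLightCount = 8;
    }
}
```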
Also, for an explosion, I'd recommend using just a single point light (not a light attached to each particle), as that is all that's required. Any more lights would have a considerable performance impact on the GPU, especially if there is more than one explosion in the scene at a time.
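A common pattern for that single explosion light is to spawn it with the effect, fade its intensity, and destroy it when done. A minimal sketch (the component and its values are hypothetical examples):

```csharp
using UnityEngine;

// One point light per explosion, faded out and destroyed over its lifetime.
public class ExplosionLight : MonoBehaviour
{
    public float duration = 0.5f;     // example lifetime in seconds
    public float startIntensity = 4f; // example peak brightness

    Light lt;
    float t;

    void Awake()
    {
        lt = GetComponent<Light>();
        // Shadow casting from the effect light is what triggered the bug above.
        lt.shadows = LightShadows.None;
    }

    void Update()
    {
        t += Time.deltaTime;
        lt.intensity = Mathf.Lerp(startIntensity, 0f, t / duration);
        if (t >= duration) Destroy(gameObject);
    }
}
```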
I recreated the scene as you described, but I can't reproduce your issue.
I mostly followed this tutorial and added a few cubes on a plane:
https://unity3d.com/learn/tutorials/topics/graphics/adding-lighting-particles
I would need a screenshot of your light components (both the directional light and the point light), the particles, and the cubes (mostly the material). I can't comment because I don't have enough reputation yet, so I'll delete this once you add the screenshots.

How to make this lighting effect in HaxeFlixel or Unity?

How do I create this lighting effect in HaxeFlixel or Unity?
I will tell you how it was created in this specific case. The question is very broad, and there are many ways to create lighting effects in both Unity and HaxeFlixel.
The image is from the game Beneath the City by Deepnight, accessible on his website. The game is written in Haxe, though not with HaxeFlixel; it runs on Deepnight's personal engine, which targets Flash. The source code is available here. The lighting happens in src/Level.hx, specifically in the renderLights method. From what I gather, a light layer is composited above the sprites of the level. Lights are drawn onto this layer (or bitmap data) as rectangles. The layer is then blurred, so the lights don't appear as solid rectangles but as soft, spreading pools of light; this is done with Flash blur filters. Blend modes are used so the light adds luminosity. A dark mask is then layered above the blurred layer, presumably to block light in certain locations, such as in the game's fog. This all takes place between lines 208 and 248.
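To make the technique concrete, here's a small CPU-side sketch of the same idea in C#: splat each light as a rectangle into an offscreen buffer, blur the buffer, and additively blend it over the scene. This is my own illustration of the approach described above, not Deepnight's code; his version uses Flash blur filters and blend modes on bitmap data:

```csharp
using System;

static class BlurLighting
{
    // Splat each light as a solid rectangle of full intensity, then blur.
    public static float[,] RenderLights((int x, int y, int w, int h)[] lights, int width, int height)
    {
        var buf = new float[width, height];
        foreach (var (x, y, w, h) in lights)
            for (int i = Math.Max(x, 0); i < Math.Min(x + w, width); i++)
                for (int j = Math.Max(y, 0); j < Math.Min(y + h, height); j++)
                    buf[i, j] = 1f;

        // Box-blur so the hard rectangles become soft pools of light
        // (Flash's BlurFilter plays this role in the original).
        const int r = 4;
        var blurred = new float[width, height];
        for (int i = 0; i < width; i++)
            for (int j = 0; j < height; j++)
            {
                float sum = 0f; int n = 0;
                for (int di = -r; di <= r; di++)
                    for (int dj = -r; dj <= r; dj++)
                    {
                        int u = i + di, v = j + dj;
                        if (u >= 0 && u < width && v >= 0 && v < height) { sum += buf[u, v]; n++; }
                    }
                blurred[i, j] = sum / n;
            }
        return blurred;
    }

    // Additive blend: the light layer only ever brightens the scene pixel underneath.
    public static float Composite(float scenePixel, float lightPixel)
        => Math.Min(1f, scenePixel + lightPixel);
}
```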
This game truly has gorgeous visuals, but the lighting goes beyond the initial blurred lights: particles float around the scene and really add to the lighting's aesthetic.
That's all how he does it, though; how you do it is up to you. For HaxeFlixel, I would first consider alternatives such as this geometric lighting, or this method of applying lighting to scenes, which looks closer to the screenshot, or even a very simple circle-based lighting alternative. Searching for "Unity 2D lighting" brings up plenty of options.
You've got plenty of options on how to approach the issue. I didn't answer this with a direct tutorial because the question isn't at the code level.

OpenGL ES 2D - z-ordering, depth buffer vs drawing in order

I'm completely new to OpenGL, so sorry if this is a silly question. Also, in case it makes a difference: I'm using OpenGL ES 1.1.
Currently I'm drawing sprites in order of texture, as I've read it's better for performance (makes sense). But now I'm wondering whether that was the right approach because I need certain sprites to be in front of others regardless of texture.
As far as I'm aware, my options for z-ordering would be either to enable the depth buffer and use that, or to switch the drawing order so the sprites are drawn in the order of a z value.
I've read that the depth buffer can be a performance hit, but so would changing the order. Which should I do?
The short answer is, sort the sprites.
It sounds like you're creating something that's purely 2D-based. While a z-buffer can be a very useful tool, it can be a significant performance hit if the hardware doesn't support it well, and if you're not actually drawing 3D objects that may intersect one another, it doesn't make a lot of sense to me.
In addition, if you have any sprites that are partially transparent, i.e. pixels with an alpha value that isn't 0 or 255 (or 0.0 or 1.0 if using floating point), then you have to sort anyway.
As a side note, I believe the performance loss when changing "sprites" only occurs when switching out surfaces, and even then only rarely. One way to mitigate this is to pack as many different sprites into one image as you can, on a grid (a texture atlas), and draw small pieces of that surface as your sprites.
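Putting those points together, a common compromise is to sort back-to-front by z and break ties by texture, so you keep correct blending order while still batching most texture switches. A minimal sketch (the Sprite type here is a hypothetical stand-in for whatever your engine uses):

```csharp
using System.Collections.Generic;

// Hypothetical sprite record: depth plus the GL texture it draws with.
record Sprite(float Z, int TextureId);

static class SpriteSorter
{
    // Back-to-front by z (required for alpha blending), texture as a tie-breaker
    // so sprites at the same depth still batch into runs of one texture.
    public static void SortForDrawing(List<Sprite> sprites) =>
        sprites.Sort((a, b) =>
        {
            int byDepth = b.Z.CompareTo(a.Z); // larger z = farther away, drawn first
            return byDepth != 0 ? byDepth : a.TextureId.CompareTo(b.TextureId);
        });
}
```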