Why can't I choose the stretched billboard render mode in a script? - unity3d

I am trying to change the particle render mode in a script, but I can't choose the stretched billboard mode, even though it is possible to change it in the editor.
ps = GetComponent<ParticleSystem>();
psr = GetComponent<ParticleSystemRenderer>();
psr.renderMode = ParticleSystemRenderMode.Stretch;
This compiles fine, but
psr.renderMode = ParticleSystemRenderMode.StretchedBillboard;
does not; the enum has no StretchedBillboard member.
Thanks in advance.

I think that Stretched Billboard is a combination of the default render mode (Billboard) and a non-zero scaling applied relative to the camera. So, through code, you should be able to reproduce it with a combination of those two values.
In fact, looking at the documentation, stretched billboard is explained as the billboard mode, but with scaling applied.
Looking at the ParticleSystemRenderer API, there's the cameraVelocityScale property ("How much are the particles stretched depending on the Camera's speed."). There's also a snippet there that should give a proper example.
I hope this could help you out.
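For what it's worth, a minimal sketch of doing this from code: in the scripting API, the inspector's Stretched Billboard corresponds to the ParticleSystemRenderMode.Stretch value, and the amount of stretching is controlled by the renderer's cameraVelocityScale, velocityScale and lengthScale properties (the values below are just illustrative):

```csharp
using UnityEngine;

public class StretchedParticles : MonoBehaviour
{
    void Start()
    {
        // In the scripting API, the inspector's "Stretched Billboard"
        // is the Stretch enum value.
        var psr = GetComponent<ParticleSystemRenderer>();
        psr.renderMode = ParticleSystemRenderMode.Stretch;

        // These control the stretching; the values are just examples.
        psr.cameraVelocityScale = 0f; // stretch based on camera movement
        psr.velocityScale = 0.1f;     // stretch proportional to particle speed
        psr.lengthScale = 2f;         // stretch along the velocity axis
    }
}
```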

Related

Pixel local position within a quad

I have this quad in the 3D scene:
I need to get the local positions of all painted (non-transparent) pixels of this quad. I already tried using GetPixels() and filtering the result by alpha value to keep only pixels with a valid color. But then I noticed that this method doesn't let me get the pixels' local positions, because it returns a Color array, which doesn't carry that information. Googling turned up nothing; maybe the only way to get what I want is to build something at the shader level, but I don't know much about that subject either. I can give more context if needed, but I'm trying to keep things short here. Also, there's no code to show except for the wrong attempt using GetPixels(), which doesn't work for my case as far as I know.
Any help is appreciated!
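One possible direction, assuming the quad is Unity's built-in Quad (1x1 units, centred on its local origin, with UVs spanning the whole texture) and the texture is marked readable: map each opaque pixel's texture coordinates to the quad's local space yourself. A sketch:

```csharp
using System.Collections.Generic;
using UnityEngine;

public class PaintedPixelPositions : MonoBehaviour
{
    // Assumes Unity's built-in Quad: 1x1 units, centred on the local origin,
    // with UV (0,0) at the bottom-left and (1,1) at the top-right.
    public static List<Vector3> GetPaintedLocalPositions(Texture2D tex, float alphaThreshold = 0f)
    {
        var result = new List<Vector3>();
        Color[] pixels = tex.GetPixels(); // row-major, starting at the bottom-left row

        for (int y = 0; y < tex.height; y++)
        {
            for (int x = 0; x < tex.width; x++)
            {
                if (pixels[y * tex.width + x].a <= alphaThreshold)
                    continue;

                // Map the pixel centre from texture space to the quad's local space.
                float localX = (x + 0.5f) / tex.width - 0.5f;
                float localY = (y + 0.5f) / tex.height - 0.5f;
                result.Add(new Vector3(localX, localY, 0f));
            }
        }
        return result;
    }
}
```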

Unity: skinned mesh vertexbuffer output slightly different than correct results

It's like the hip is stuck in place. In the pictures (taken at different times in the animation, since I can't upload a gif), the red wireframe is the drawing of the raw output triangles, and everything else is default Unity, i.e. the correct output.
What am I doing wrong? What am I missing? This has been driving me nuts for 2-3 days now; any help is appreciated.
Since you didn't post any code, I'll do some guessing about what is going on. Disclaimer: I actually happened to run into a similar problem myself.
First, you should know that the Update calls of different MonoBehaviours run in an unspecified order (not necessarily random as such, but you see the point). If you bake the mesh in one component, it can still be one frame late relative to the Animator component.
There are two solutions to this: the first is to specify the order in Script Execution Order in Project Settings, while the second is to use LateUpdate instead of Update when updating the mesh after skinning.
The second problem you might have run into is the scale / position of the skinned mesh. Even if it does not contribute to the animation movement at all, it can still wreck the baked mesh (and any collision built from it) later on.
To fix that, make sure that all your SkinnedMeshRenderer objects have Local Position, Local Rotation and Local Scale set to identity in the Transform component (0,0,0 for position and rotation, 1,1,1 for scale). If not, reset those values in the editor. It will not change the animations, but it will change the generated mesh.
If neither solution works, please describe the problem more thoroughly.
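As a sketch of the LateUpdate approach, assuming the mesh is being baked with SkinnedMeshRenderer.BakeMesh:

```csharp
using UnityEngine;

public class BakeAfterSkinning : MonoBehaviour
{
    public SkinnedMeshRenderer skinnedRenderer;
    Mesh bakedMesh;

    void Awake()
    {
        bakedMesh = new Mesh();
    }

    // LateUpdate runs after the Animator has updated the skinning for this
    // frame, so the baked mesh is no longer one frame behind the animation.
    void LateUpdate()
    {
        skinnedRenderer.BakeMesh(bakedMesh);
        // ... draw or process bakedMesh here ...
    }
}
```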

How do I get a Light's Range value in Shader?

I'm trying to write a simple frag/vert shader that, depending on whether it is in the range of a light, will paint the appropriate colour from either the 'lit' texture or from the 'unlit' texture.
Therefore, I need to compare the distance to the light with the light's range.
I've been googling all kinds of things, but I can't seem to find a way of accessing the range value of the light. Is there a way to do so? If not, is there some kind of derived data I could use as an alternative?
Update
I was able to find this method here, which seems to be the most promising so far, however after playing around for a bit, I still can't seem to get what I need. There's some talk about _LightMatrix0 not being populated. Can anyone confirm?
Update 2
I found the variable unity_LightAtten in the Unity Shader Variables documentation. However, this is only used for Vertex Lit shading, which isn't exactly ideal, especially considering the lack of console support.
Could there be a way to pipe this variable to Forward Rendering?
You can pass Light.range into the shader using Material.SetFloat. You need to attach a script to do that.
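A minimal sketch of that idea; the property names _LightRange and _LightPos are assumptions for illustration and must match float/vector properties declared in your shader:

```csharp
using UnityEngine;

public class PassLightRange : MonoBehaviour
{
    public Light sceneLight;  // the light whose range we need
    public Renderer target;   // the renderer using the custom shader

    void Update()
    {
        // "_LightRange" and "_LightPos" must match properties declared
        // in the shader; the names here are just placeholders.
        target.material.SetFloat("_LightRange", sceneLight.range);
        target.material.SetVector("_LightPos", sceneLight.transform.position);
    }
}
```

In the fragment shader you can then compare distance(worldPos, _LightPos.xyz) against _LightRange to pick the lit or unlit texture.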

Stop display through object despite transparent material?

This is very difficult to describe, but I feel like it should be possible to do.
Essentially, this is where I am at:
Since the Earth's material uses outward-facing normals, the far side of the globe itself isn't rendered. However, due to the transparency of the material, I can see the markers that exist on the other side of the globe.
How can I stop things from displaying through the material? The background is the skybox and is the only thing that I would like to show through the transparency.
Any help or advice is appreciated.
It's a little tricky to do this. The only way is to turn OFF each green skyscraper when it is at the "back".
Let's say your world is simply centered on 0,0,0.
In that case, if you think about it, it is this simple:
if z > 0f you want the green skyscraper to be invisible; if z < 0f you want it to appear normally. (Which side counts as "back" depends on where your camera sits, so flip the sign if needed.)
Since this is Unity you work in an agent-like manner, so it really is that simple. Make a script along these lines:
using UnityEngine;

public class HideMeIfAwayFromCamera : MonoBehaviour
{
    Renderer rend;

    void Awake() { rend = GetComponent<Renderer>(); }

    void Update()
    {
        // Invisible on the far hemisphere; flip the comparison
        // if your camera is on the other side.
        rend.enabled = transform.position.z < 0f;
    }
}
That's probably your simplest and best solution here. In any event, I would try that first. Let's hear how it works for you.
Consider that you may want to make the on/off point a little ahead or behind the half-way plane, try it.
Note that there is another approach: use a cutout shader, put the skyscrapers on a different layer altogether, and use yet another layer for any skyscrapers whose base is beyond the horizon. That way you can show the "tops" of any skyscrapers that are "just behind" the horizon while hiding their bases. It does seem way too complicated, though. I think the best result will come from just turning off the rear ones, as shown above.
(Note too that it's not "physical". If the globe is transparent: you SHOULD be able to see the skyscrapers at the back. So you'd have to try some things to see what feels good.)

programming custom blendmode for Adobe Photoshop (or After Effects)

What's the easiest way to create a custom blendmode for Adobe Photoshop?
I want to be able to blend two images together according to rules that can't be created by combining existing blendmodes
(i.e. the blend result shall be dst = backLayer + (frontLayer*2 - 1), which can't be achieved by applying Linear Dodge twice and then subtracting a white layer, since clamping will occur; and when working in a 32-bit workspace, the blend modes no longer behave as expected)
I tried to program pbk kernels in PixelBender, but Photoshop's PixelBender doesn't seem to support pbk kernels that take more than one image as input (can work only as filter on a single image).
What's the most straight forward way to do this?
Check if the following technique works:
http://www.photoshopgurus.com/forum/general-photoshop-board/25925-custom-blending-modes-possible.html
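For reference, the blend rule itself is trivial to compute outside Photoshop (for example, if you export the layers and composite them yourself); a sketch in C#, applied per channel on normalized [0..1] values and deliberately without clamping:

```csharp
// A sketch of the desired blend rule, per channel on normalized
// [0..1] values, deliberately left unclamped (as a 32-bit
// workspace would require).
public static class CustomBlend
{
    // dst = back + (front * 2 - 1)
    public static float Blend(float back, float front)
    {
        return back + (front * 2f - 1f);
    }
}
```

For example, Blend(0.5f, 0.75f) gives 0.5 + 0.5 = 1.0, whereas chaining clamped built-in modes would have lost the intermediate values.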