Sample gradient range in Unity Shader Graph - unity3d

I'm trying to put together a simple shader with Unity's Shader Graph. The material should appear white, yellow and blue, and static. However, the sampled gradient only displays yellow. If I change the time input to Sine, the gradient blends through the colours. What is going wrong?

Plug a "UV" node into the gradient sampler's "Time" input instead of a "Time" node.

First thing to know is that a Gradient goes from 0 to 1.
The 0 is on the left (in white) and the 1 is on the right (in yellow).
I guess that the blue is between approx. 0.25 and 0.3.
The Time node, on the other hand, outputs a constantly increasing value far above 1, so the sample is always clamped to the maximum, which means plain yellow.
If you use SinTime, the value alternates with a sinusoid between 0 and 1, making the color go from left to right then from right to left.
Additional information: as of today (2019.1.12), there seems to be a bug in Shader Graph when you use a Gradient on a system whose decimal separator is ",".
The generated shader code is written with "," instead of "." inside function calls, which breaks the arguments.
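The clamping behaviour described above can be sketched in plain Python (this is an illustration of how a gradient sampler works, not the Unity API; the key positions and colours are made up to match the white/blue/yellow gradient):

```python
# A minimal sketch of gradient sampling: the "time" input is clamped to [0, 1],
# so an ever-growing Time value always lands on the last (rightmost) key.
def sample_gradient(keys, t):
    """keys: sorted list of (position, (r, g, b)); t is clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))          # anything above 1 sticks to the last key
    for (p0, c0), (p1, c1) in zip(keys, keys[1:]):
        if p0 <= t <= p1:
            f = (t - p0) / (p1 - p0)   # linear blend between neighbouring keys
            return tuple(a + (b - a) * f for a, b in zip(c0, c1))
    return keys[-1][1]

keys = [(0.0, (1, 1, 1)), (0.3, (0, 0, 1)), (1.0, (1, 1, 0))]  # white, blue, yellow
print(sample_gradient(keys, 12.7))  # an unclamped Time value -> always yellow
```

A UV coordinate, by contrast, sweeps 0 to 1 across the surface, so every part of the gradient shows up.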

Related

Strange shading behaviour with normal maps in UE4

I've been having some very strange lighting behaviour in Unreal 4. In short, here's what I mean:
Fig 1, First, without any normal mapping on the bricks.
Fig 2, Now with a normal map applied, generated based on the same black-and-white brick texture.
Fig 3, The base pixel normals of the objects in question.
Fig 4, The generated normals which get applied.
Fig 5, The material node setup which produces the issue, as shown in Fig 2
As you can see, the issue occurs when using the normals generated by the HeightToNormalSmooth node. As shown, this is not an issue with the object normals (see Fig 3), nor with a badly exported normal map (there isn't one in the traditional sense), nor with the HeightToNormalSmooth node itself (Fig 4 shows that it generates the correct bump normals).
To be clear, the issue is that using a normal texture at all (this occurs across all my materials) causes the positive-Y-facing faces of an object to turn completely black (or, it seems, to become purely reflection-based, as increasing the material's roughness makes the black faces look less 'shiny').
This is really strange: I've tested with multiple different skylight setups and sun directions, yet this always happens (even when lit directly), but only on +Y aligned faces.
If anyone can offer insight that would be greatly appreciated.
You're subtracting what looks like 1 from the input that then goes into a Multiply by 1, if I'm reading it correctly. In most cases, this will make any image return black.
In UE4, as in many other programs, the colours in an image are stored as decimal Red, Green and Blue values in the range 0 to 1. To make red, I could use R = 1, G = 0, B = 0; if R = 0, G = 0, B = 0, the result is black. A Multiply node takes each pixel of the image you feed into it (white being R = 1, G = 1, B = 1) and multiplies its R, G and B values by the other input. Since zero multiplied by any number is zero, subtracting 1 first sets every pixel to R = 0, G = 0, B = 0. Thus, all zeros, and you get black.
I also noticed you then multiply by 1, which won't do much of anything: if your input is 0 (black), multiplying it by 1 won't change it, because 0 * 1 is still 0.
To fix your issue, try changing the value you subtract from your input to something smaller than 1, such as 0.5 or 0.6.
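A quick numeric sketch of the node math above (plain Python, not UE4 nodes; the clamp at zero stands in for how a negative colour value is ultimately displayed):

```python
# Channel values live in [0, 1], so subtracting 1 drives every channel to 0
# or below, which renders as black; multiplying by 1 then changes nothing.
def subtract_then_multiply(rgb, sub, mul):
    # mirror the Subtract and Multiply material nodes per channel
    return tuple(max(0.0, (c - sub) * mul) for c in rgb)

white = (1.0, 1.0, 1.0)
print(subtract_then_multiply(white, 1.0, 1.0))   # -> (0.0, 0.0, 0.0), black
print(subtract_then_multiply(white, 0.5, 1.0))   # -> (0.5, 0.5, 0.5), mid grey
```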
So I've discovered why this was an issue. It turns out there's a little option in the material settings called 'Tangent Space Normal'. It is on by default ('for convenience'); disabling it appears to completely fix the issue with the normal maps generated by HeightToNormalSmooth.

Display RGB colors from two changing values

I have two values (between 0 and 255), from audio sources.
I'm trying to display two different RGB colours in a rectangle split down the middle, with the colours derived from these two changing numbers and updating as the numbers change.
Thanks for your help. I know it's simple, but I'm really stuck.
Have a nice day.
To show a “split rectangle,” I would suggest using two panel objects side by side.
You can control the panel object’s colour using the bgcolor (or bgfillcolor) message followed by a list of four numbers, e.g.: bgcolor 1 0 0 0.5
The numbers correspond to Red, Green, Blue, and Alpha (opacity) and are all in the range of 0–1. So bgcolor 1 0 0 0.5 would set the colour of the panel to 100% red and 50% transparent.
In your case, you have values 0–255 so you need to scale those down to 0–1 and then decide how exactly you want to map them. For example, if you wanted each panel to be white when the value is 255, black when it is 0, and some shade of grey in between, you could do something like this:
value → [scale 0 255 0. 1.] → (bgcolor $1 $1 $1 1) → [panel]
Some notes:
The last two arguments of scale must be floats, otherwise it will just output integers 0 and 1.
The $1 syntax of the message box gets replaced by input. You could decide for example to only control the green part of the colour: bgcolor 0 $1 0 1.
If your value changes very frequently, you might want to limit how often it updates the colour. Most screens won’t update faster than 60 times per second, so changing the colour every millisecond for example is a waste of resources. If you’re using snapshot~ to get the value from audio, you could use an argument of 17 or higher, to limit the values to less than 60 times per second (1000ms / 60 = 16.6…). Alternatively, you could use the speedlim object to limit how often you recalculate your colour.
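For clarity, here is the linear remap that [scale 0 255 0. 1.] performs, sketched in Python (the function and code are illustrative; in Max you'd just use the scale object):

```python
# The [scale 0 255 0. 1.] mapping: a plain linear remap from one range to another.
def scale(x, in_lo, in_hi, out_lo, out_hi):
    return out_lo + (x - in_lo) * (out_hi - out_lo) / (in_hi - in_lo)

print(scale(255, 0, 255, 0.0, 1.0))  # -> 1.0 (full white)
print(scale(0,   0, 255, 0.0, 1.0))  # -> 0.0 (black)
print(scale(128, 0, 255, 0.0, 1.0))  # ~0.502 (mid grey)
```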

Is it possible to fill transparency with white in a texture in code?

I have some textures containing some transparency parts (a donut, for example, which would show a transparent center). What I want to do is fill the middle of the donut (or anything else) with a plain white, in code (I don't want to have a double of all my assets that need this tweak in one part of my game).
Is there a way to do this? Or do I really have to have 2 of each of my assets?
First of all, it is possible to change a transparent texture to a non-transparent one; if it weren't, graphics editors would be in trouble.
Solution 1 - Easy but takes repetitive editing by hand
The question you should be asking yourself is whether you can afford the conversion at run time, or whether having two sets of textures would be more efficient; from experience, I find the latter tends to win.
Solution 2 - Extremely hard
You will need a shader that supports transparency and marks the sections that have to be shaded white, i.e. it keeps track of which area will later be filled with white. Since your "donut" is already transparent in places, it presumably already uses a texture with an alpha channel, but you will have to write your own shader mask and be able to distinguish which areas are okay to fill white and which are not (the fun problem here). What you need to do is find the condition under which that alpha no longer needs to be alpha and becomes white; once the condition is met, you can change the alpha via the Color's alpha property. The only way I see this working is if there is a pattern to the objects, so that you can apply some mathematical model to them and use it to find which area gets filled. If the objects are very different, making two sets of textures starts to look more appealing.
Solution 3 - Medium with high re-use value
You could edit the textures to have two different colors, say pink and green. Green is the area that gets turned white and pink is always transparent. When green should not be white then it is transparent. You would have to edit your textures by hand as well.
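If you can get at the raw pixel data when loading the texture, a CPU-side variant of the run-time route is simply to composite the RGBA pixels over white. A hedged NumPy sketch, assuming you can read the texture into an array (how you get and put that array back depends on your engine):

```python
# Composite an RGBA image over white so transparent areas become opaque white.
import numpy as np

def fill_transparency_with_white(rgba):
    """rgba: uint8 array of shape (h, w, 4). Returns an opaque RGB array."""
    rgb = rgba[..., :3].astype(np.float32)
    alpha = rgba[..., 3:4].astype(np.float32) / 255.0
    # standard "over" compositing against a white background
    out = rgb * alpha + 255.0 * (1.0 - alpha)
    return out.astype(np.uint8)

# fully transparent pixel -> white; opaque red stays red
pixels = np.array([[[255, 0, 0, 255], [0, 0, 0, 0]]], dtype=np.uint8)
print(fill_transparency_with_white(pixels))
```

Note this fills everything transparent, including the area outside the donut; combined with the colour-keying idea above (green = fill, pink = keep transparent) you could restrict it to the regions you want.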

Brightness changes in color replacement

I am working on replacing certain color in image by User's selected color. I am using OpenCV for color replacement.
Here in short I have described from where I took help and what I got.
How to change a particular color in an image?
I followed the steps and took the basic idea from the answers at the link above. The accepted answer there says you only need to change the hue for colour replacement.
After that I ran into an issue similar to
color replacement in image for iphone application (i.e. It's good code for color replacement for those who are completely beginners)
From that issue I got the idea that I also need to change the saturation.
Now I am running into issues like:
"When my source image is too light (i.e. high brightness) and I replace a colour with a dark colour, the replaced colour looks light instead of dark, so it doesn't match the colour I used for the replacement."
This happens because I am not taking brightness into account in the replacement. This is where I am stuck: what formula or approach should I use to adjust the brightness?
If I simply replace the brightness of the image with the brightness of the destination colour, it looks like a flat replacement and the image loses its actual shadows and edges.
Edit:
When I take the brightness of the source (i.e. the pixel being processed) into account in the replacement, I face another issue. Let me explain with a scenario from my application:
For example, I change the colour of a car (as whiteAngl explains), then erase a portion of the newly coloured car and recolour the erased portion. Now the colour applied after erasing doesn't match the colour applied before, because the pixel being processed is different each time, so the lightness of the output colour changes. How can I overcome this issue?
Any help will be appreciated
Without seeing the code you have tried, it's not easy to guess what you have done wrong. To show you with a concrete example how this is done let's change the ugly blue color of this car:
This short Python script shows how we can change the color using the HSV color space:
import cv2
orig = cv2.imread("original.jpg")
hsv = cv2.cvtColor(orig, cv2.COLOR_BGR2HSV)
# OpenCV 8-bit hue runs 0-179, so wrap instead of letting uint8 overflow
hsv[:, :, 0] = (hsv[:, :, 0].astype(int) + 100) % 180
bgr = cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR)
cv2.imwrite('changed.jpg', bgr)
and you get:
On Wikipedia you see the hue is between 0 and 360 degrees, but for the values OpenCV uses see its documentation (8-bit hue runs from 0 to 179). You can see I added 100 to the hue of every pixel in the image. I guess you want to change the color of only a portion of your image, but you probably get the idea from the above script.
Here is how to get the requested dark red car. First we get the red one:
The dark red one that I tried to keep the metallic feeling in it:
As I said, the equation you use to shift the light of the color depends on the material you want to have for the object. Here I came up with a quick and dirty equation to keep the metallic material of the car. This script produces the above dark red car image from the first light blue car image:
import cv2
import numpy as np

orig = cv2.imread("original.jpg")
hls = cv2.cvtColor(orig, cv2.COLOR_BGR2HLS)
# shift the hue from blue to red (8-bit hue runs 0-179, so wrap)
hls[:, :, 0] = (hls[:, :, 0].astype(int) + 80) % 180
for i in range(1, 50):  # repeatedly reduce lightness
    # select pixels whose lightness is above 0 (black) and below 220 - i*2;
    # the falling threshold means bright pixels are darkened fewer times:
    # in the first iteration pixels with lightness >= 218 are untouched,
    # in the second those >= 216, and so on
    ind = (hls[:, :, 1] > 0) & (hls[:, :, 1] < (220 - i * 2))
    # subtract 1 from the lightness of the selected pixels (True == 1, False == 0)
    hls[:, :, 1] -= ind.astype(np.uint8)
bgr = cv2.cvtColor(hls, cv2.COLOR_HLS2BGR)
cv2.imwrite('changed.jpg', bgr)
You are right: changing only the hue will not change the brightness at all (or only very weakly, due to some perceptual effects), and what you want is to change the brightness as well. And as you mentioned, setting the brightness to the target brightness will flatten all pixel variation (you will only see changes in saturation). So what's next?
What you can do is change each pixel's hue and additionally match the average lightness. To do that, compute the average brightness B of all the pixels to be processed, then multiply all your brightness values by Bt/B, where Bt is the brightness of your target color.
Doing that will match both the hue (the first step) and the brightness (the second step), while preserving the edges (because you only modified the average brightness).
This is a special case of histogram matching, where here, your target histogram has a single value (the target color) so only the mean can be matched in a reasonable way.
And if you're looking for a "credible source" as stated in your bounty request, I am a postdoc at Harvard and will be presenting a paper on color histogram matching at Siggraph this year ;) .
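The average-brightness matching step described above can be sketched like this (NumPy; the names lightness and target_lightness are illustrative, standing for the L channel of the pixels being recoloured and of the target color):

```python
# Scale all lightness values so their mean matches the target, which keeps
# relative differences (shadows, edges) intact.
import numpy as np

def match_mean_brightness(lightness, target_lightness):
    """Multiply all values by Bt/B so the mean becomes target_lightness."""
    current_mean = lightness.mean()            # B
    scaled = lightness * (target_lightness / current_mean)  # * Bt/B
    return np.clip(scaled, 0, 255)             # keep values in a valid range

patch = np.array([100.0, 150.0, 200.0])   # mean = 150
out = match_mean_brightness(patch, 75.0)  # darken to a mean of 75
print(out, out.mean())
```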

Pixel color matching estimate

For image scanning purposes, I'd like a pixel (which I can get from a UIImage) to match (for a certain percentage) to a pre-set color.
Say pink. When I scan the image for pixels that are pink, I want a function to return a percentage of how closely the pixel's RGB value resembles my pre-set RGB value. This way, all (well, most) pink pixels become 'visible' to me, not just exact matches.
Is anyone familiar with such an approach? How would you do something like this?
Thanks in advance.
UPDATE: thank you all for your answers so far. I accepted the answer from Damien Pollet because it helped me further, and I came to the conclusion that calculating the vector difference between two RGB colors works perfectly for me (at the moment). It might need some tweaking over time, but for now I use the following (in Objective-C):
float difference = pow( pow((red1 - red2), 2) + pow((green1 - green2), 2) + pow((blue1 - blue2), 2), 0.5 );
If this difference is below 85, I accept the color as my target color. Since my algorithm needs no precision, I'm ok with this solution :)
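The same check, sketched in Python for reference (the same math as the Objective-C one-liner; 85 is the threshold from the update above):

```python
# Euclidean distance between two RGB colors, accepted below a fixed threshold.
import math

def is_close_enough(c1, c2, threshold=85.0):
    """c1, c2: (r, g, b) tuples with 0-255 channels."""
    difference = math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))
    return difference < threshold

pink = (255, 192, 203)
print(is_close_enough((240, 170, 190), pink))  # a nearby pink -> True
print(is_close_enough((0, 0, 255), pink))      # pure blue -> False
```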
UPDATE 2: while searching further I found the following URL, which might be quite (understatement) useful if you are looking for something similar.
http://www.sunsetlakesoftware.com/2010/10/22/gpu-accelerated-video-processing-mac-and-ios
I would say just compute the vector difference to your target color, and check that its norm is less than some threshold. I suspect some color spaces are better at this than others, maybe HSL or L*a*b*, since they separate brightness from the hue itself, and so might represent a small perceptual difference by a smaller color vector...
Also, see this related question
Scientific answer: you should convert both colors to the Lab color space and calculate the Euclidean distance there. That value is also called deltaE.
The Lab space was developed (using test subjects) for exactly that reason: color pairs with equal distances in this space correspond to equal perceived color differences.
However, it sounds like you are not looking to match a specific color, but rather a color range (let's say all skin tones). That might require more user input than just a reference color plus a deltaE tolerance:
a reference color with three tolerances, for hue, saturation and brightness
a cloud of reference color samples
...
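Once both colors are converted to Lab (for example with OpenCV's cv2.cvtColor and COLOR_BGR2LAB), the deltaE mentioned above is just the Euclidean distance. A minimal sketch of the classic CIE76 version (the sample Lab values here are made up):

```python
# CIE76 deltaE: Euclidean distance between two colors in Lab space.
import math

def delta_e_cie76(lab1, lab2):
    """lab1, lab2: (L, a, b) tuples. Roughly, values under ~2.3 are a
    just-noticeable difference; identical colors score 0."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

print(delta_e_cie76((50.0, 20.0, -10.0), (50.0, 20.0, -10.0)))  # -> 0.0
print(delta_e_cie76((50.0, 20.0, -10.0), (80.0, -30.0, 40.0)))  # large difference
```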