Unity Image.FillAmount not working as wanted

I'm working on something that relies on the fill amount of the image.
However, the fill amount seems to be weighted slightly? I'm unsure. Here is my evidence:
This is the image with the amount set to 0.2: [screenshot]
However, setting it to 0.1 shows this: [screenshot]
I would have thought that 0.1 would show half of what 0.2 shows. Does Unity therefore actually measure fill amount from 0.1 -> 1, instead of 0 -> 1 as the documentation suggests, or am I just being stupid?

Think I know the answer to this one.
I am assuming your image may not be clipped to the edge, and you have a transparent border? Also ensure the image is a power of two when you import it, as forcing a power of two in the texture import settings could also introduce a border.
Having a border would mislead you into thinking nothing is showing up until 0.1, when in fact it is showing the transparent border.
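For reference, here is a minimal C# sketch of driving the fill from a script (the class and field names are just placeholders for illustration); fillAmount itself is defined over 0 to 1, so any apparent offset comes from transparent pixels in the sprite rather than from the API:

using UnityEngine;
using UnityEngine.UI;

public class FillExample : MonoBehaviour
{
    public Image bar; // the Image's "Image Type" must be set to Filled in the Inspector

    void Start()
    {
        // 0 = nothing visible, 1 = the whole sprite, including any transparent border pixels
        bar.fillAmount = 0.1f;
    }
}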

Related

Sample gradient range in Unity Shader Graph

I'm trying to put together a simple shader with Unity's Shader Graph. The material should appear white, yellow and blue - and static. However, the sampled gradient only displays yellow. If I change the time input to Sine, the gradient blends through the colours. What is going wrong?
Plug an "UV" node in the gradient sampler's "time" input instead of a "Time" node.
First thing to know is that a Gradient goes from 0 to 1.
The 0 is on the left (in white) and the 1 is on the right (in yellow).
I guess that the blue is between approx. 0.25 and 0.3.
On the other hand, you are using Time, whose value grows continuously far above 1. Therefore it is always at the maximum, meaning plain yellow.
If you use SinTime, the value alternates with a sinusoid between 0 and 1, making the color go from left to right then from right to left.
Additional information: as of today (2019.1.12) it seems that there is a bug in Shader Graph when you use Gradient on a system where the decimal separator is ","
The generated shader is created with a "," instead of "." in functions, messing up the arguments.

Which color-key to use in a heat-map with a small range of values?

My data covers only a small range, but I would still like to make the small differences between the data points visible in a heat-map. Which color-key is best to maximize color intensity (and avoid generating a greyish map), and how do I set the range in pheatmap?
You didn't give enough information for an exact code example for your sample data, but something like the below is a way to get at the problem. In terms of what actual colours you want to use, I recommend you play around with it to see what looks best; I have just subbed in red and blue as a proxy.
pheatmap(yourdata, color = colorRampPalette(c("red", "blue"))(length(-12:12)),breaks=c(-12:12))
The length() call sets how many colours the ramp generates, while breaks=c(...) tells pheatmap where to place the breaks. So, say you wanted breaks every 0.2 from 0 to 1, you would modify it to:
breaks=c(0,0.2,0.4,0.6,0.8,1.0)
You can play around with the break gradients to get something that works for your dataset.
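If it helps, here is a rough, self-contained sketch with made-up example data (the matrix, range and colours are just placeholders); note that pheatmap expects one more break than colour, so the ramp is asked for length(breaks) - 1 colours:

library(pheatmap)

# narrow-range example data and a fixed set of breaks around it
yourdata <- matrix(rnorm(100, mean = 0, sd = 0.1), nrow = 10)
breaks   <- seq(-0.3, 0.3, by = 0.05)

pheatmap(yourdata,
         color  = colorRampPalette(c("red", "white", "blue"))(length(breaks) - 1),
         breaks = breaks)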
This is my first attempt at answering a question on here, so please let me know if something above doesn't work for you, or if you are confused by what I've written.

Scale down a larger number to a 0 - 1 - 0 range

I'm not math savvy - but what I want to do seems to be math heavy.
I'm looking to scale down a number which ranges from 0-800 so that it falls within a range of 0-1-0 (so 400 maps to 1). Not sure if this is possible, but my attempts at a solution have not been fruitful. Any indication as to where to look for a solution would be of great benefit!
It's so I can change the alpha of images depending on their screen location - the centre-most images should be 100% visible, whilst as an image gets further towards the edge of the screen it should become more transparent. The range for alpha is 0-1.
Kind regards, and thanks in advance!
Here is the graph of 1 - abs(1 - 2x/a) where a = 800: [graph]
And here is sin(pi*x/a): [graph]
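To sanity-check the first formula with a = 800: at x = 0 it gives 1 - |1 - 0| = 0, at x = 400 it gives 1 - |1 - 1| = 1, and at x = 800 it gives 1 - |1 - 2| = 0, so the value rises linearly from 0 at one edge to 1 at the centre and falls back to 0 at the other edge. sin(pi*x/a) traces the same 0 -> 1 -> 0 shape, just as a smooth curve instead of a triangle.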

D3 Stacked Bar Chart outer padding

I've been working on adapting the stacked bar chart example (http://bl.ocks.org/mbostock/1134768). The problem I'm having is that there's always outer padding. The API lists the outer padding as a 3rd option, but omitting it or setting it to 0 still leaves some padding. In most cases it isn't too bad, but with large data sets it tends to be a huge amount of padding. For all the code relevant to my issue, you can check the link above. It's not very noticeable in that example, but the first bar isn't drawn until about 12 pixels in (in the larger data sets I'm using this can be 100 or more pixels); I want it to start at 0 pixels.
Thanks! If you need any more explanation just let me know and I'll do my best.
EDIT: After testing, it appears rangeBands() starts at 0, but I'm still not sure why the rounding from rangeRoundBands() would round as much as it did. Oh well, I can deal with using rangeBands().
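For completeness, here is a rough sketch of the scale setup in question (data, d.label and width are assumed names, and this targets the old D3 v3 API used in the linked example). rangeRoundBands() rounds each band down to a whole pixel and pushes the leftover space into extra outer padding, which is where the unexpected gap comes from; rangeBands() with an explicit outer padding of 0 starts the first bar at 0:

// D3 v3 ordinal scale: inner padding 0.1, outer padding 0
var x = d3.scale.ordinal()
    .domain(data.map(function(d) { return d.label; }))
    .rangeBands([0, width], 0.1, 0);

// rounding variant: integer band widths, but the rounding remainder becomes outer padding
// .rangeRoundBands([0, width], 0.1, 0);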

iPad UIColor Saturation Issues

I am trying to draw a UIColor on the screen of a view-based app, and I am trying to do so using HSB. It is absolutely necessary for me to use HSB in this case. I can create a UIColor object with any S value from 0.0f to 0.75f, but past that the numerical changes have no effect on the actual saturation displayed. I need it to be 1.0f, but it is still using 0.75f. Any ideas on why it is doing that, and how I can make it work?
Because of how it works, + (UIColor *)colorWithHue:(CGFloat)hue saturation:(CGFloat)saturation brightness:(CGFloat)brightness alpha:(CGFloat)alpha actually does not use HSBA values internally; it is simply a wrapper around the device RGB color space.
I think that in extreme cases there is a chance that a constant H/B/A combined with a saturation between 0.75 and 1 yields colors that differ so slightly the change becomes imperceptible, despite the color components being tracked as very precise floats. As saturation drops, the number of "available" colors decreases (the display can only show so many colors, and dropping the saturation compresses the usable colors), so the chance of collision rises.
Given that your scenario uses H 0-1, B 1, A 1 colors, which nearly invalidates my assumption, I was curious and made a test project; the colors, however, worked correctly. I'm on the iOS 4 SDK GM, so maybe it'll help if we know which SDK you're working against.
After doing some experimentation, I've discovered what my issue was.
I was using a for loop to draw single-pixel lines across a view, each with a hue value greater than the previous one. I was doing this to create a color spectrum to be used for a color picker.
My issue arose because I was using CGContext paths, not rects, to do the drawing. Paths, by default, "straddle" the created path with pixels. Because I was setting the width to one, CoreGraphics was forced to average between pixels, creating a desaturated effect. Setting the width to two set the saturation correctly, but the gradient of the spectrum was no longer smooth.
My fix for this issue was to use rects instead of paths. They did not blend between pixels, and the saturation issue was fixed.
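For anyone hitting the same problem, here is a rough Objective-C sketch of the rect-based approach described above (the view subclass and its size are assumptions, not the original code):

- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGFloat width = self.bounds.size.width;
    CGFloat height = self.bounds.size.height;

    for (NSInteger i = 0; i < (NSInteger)width; i++) {
        // one fully saturated hue per one-pixel column
        UIColor *color = [UIColor colorWithHue:(CGFloat)i / width
                                    saturation:1.0f
                                    brightness:1.0f
                                         alpha:1.0f];
        CGContextSetFillColorWithColor(ctx, color.CGColor);

        // filling a whole-pixel rect avoids the antialiased "straddling" of a 1px stroked path,
        // so adjacent columns are not averaged together and the saturation stays at 1.0
        CGContextFillRect(ctx, CGRectMake((CGFloat)i, 0.0f, 1.0f, height));
    }
}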