Broken outline image in Unity - unity3d

I want to put an outline around images but on some of the images the outline is broken. Notice the corners in the attached image.
I'm using 5 as the X and Y on the outline. In addition, if I use a color besides black, the outline seems to blend with the image. For example, if I set the color to EEC209FF, see the 2nd image.

I'll try to answer both of your questions:
Why is the outline shape broken?
That's the way the default Unity Outline script works: it simply draws your UI.Image 4 times behind your image, offset by the defined amount in 4 different directions. That is a cheap performance trick that fits most needs, but obviously not yours.
You can take a look at the Outline script on Bitbucket here.
Why is the outline color partially yellow?
Unity's Outline script draws your image in the background and multiplies its color values with your custom color.
That's why the white part of your image on the upper right edge becomes yellow in the outline (almost exactly the color you defined, EEC209FF).
The outline on the lower right stays green since it takes the colors from your image edge: approximately 19B754 multiplied with EEC209 gives 178B03, which is a slightly darker green.
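To see where 178B03 comes from, here is a quick sketch of that per-channel multiplication using Unity's Color type (the FromHex helper and the class name are illustrative, not part of Unity's API):

using UnityEngine;

// Quick check of the per-channel multiply that the Outline effect performs.
public static class OutlineColorDemo
{
    // Hypothetical helper: unpacks a 0xRRGGBB value into a Unity Color.
    static Color FromHex(int rgb) =>
        new Color(((rgb >> 16) & 0xFF) / 255f,
                  ((rgb >> 8) & 0xFF) / 255f,
                  (rgb & 0xFF) / 255f);

    public static void Run()
    {
        Color edge = FromHex(0x19B754);    // green pixel at the sprite edge
        Color outline = FromHex(0xEEC209); // the custom outline color

        // Unity's Color '*' operator multiplies channel by channel,
        // which is what the Outline effect does to the copies it draws.
        Color result = edge * outline;     // comes out to roughly 178B03, a darker green
        Debug.Log((Color32)result);
    }
}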

Related

Unity 2D display "flash effect" above tiles

I want to apply a brightening effect to my scene.
My scene contains tiles, and I want to show a white flash for a few frames from code.
I have already tried this code:
// Requires: using UnityEngine.Tilemaps;
private Tilemap tm;
...
tm = GetComponent<Tilemap>();
// This tints the whole tilemap, but it ends up darker, not brighter.
tm.color = new Color(0.5f, 1.0f, 1.0f, 1.0f);
This code darkens the scene by a certain amount, but I want to brighten it.
Your code is not working because in Unity a rendered image (in your case a tile) shows its original colors only when its tint color is white (255,255,255,255).
Any other tint color is multiplied with the image's own colors, so tinting can keep or darken a channel but never brighten it.
For instance, if you set the color of an image to red, the image's colors will become more red than in the original image.
As I see it, you have 2 ways to perform the white flash:
A) Add another image of a white rectangle that covers the whole screen and set its alpha to a low value (the lower the value, the subtler the flash effect).
In the editor, disable this object's renderer, and when you want to perform a flash effect, enable the object from code (you can improve this with animations or code to get a smooth flash animation; a rough sketch follows after option B).
B) Install the "2D Light" package. This is an experimental package that allows you to render 2D lights.
This package contains a lot of components that let you simulate light.
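A minimal sketch of option A, assuming a full-screen white UI Image has been assigned in the Inspector (the class, field, and parameter names below are illustrative):

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class ScreenFlash : MonoBehaviour
{
    public Image flashOverlay;     // white Image stretched over the whole canvas
    public float peakAlpha = 0.6f; // how strong the flash is
    public float duration = 0.15f; // seconds from peak back to invisible

    public void Flash()
    {
        StartCoroutine(FlashRoutine());
    }

    IEnumerator FlashRoutine()
    {
        flashOverlay.enabled = true;
        float t = 0f;
        while (t < duration)
        {
            t += Time.deltaTime;
            // Fade the overlay from peakAlpha down to 0 over 'duration' seconds.
            float a = Mathf.Lerp(peakAlpha, 0f, t / duration);
            flashOverlay.color = new Color(1f, 1f, 1f, a);
            yield return null;
        }
        flashOverlay.enabled = false;
    }
}

Calling Flash() pops the overlay to peakAlpha and fades it back out, which gives a short white flash over everything behind the canvas.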
I have found a way to do this.
I created a new PNG that only contains white shapes on a transparent background.
There are about 20 pieces that match the shapes of my level tilemap.
Now I just create a new (white) tilemap above the level tilemap in the shape of the highlight.
Then I set the alpha of the white tilemap in code.
It works :)
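For reference, setting the alpha of the white overlay tilemap from code can look roughly like this (the field and method names are mine; assign the overlay tilemap in the Inspector):

using UnityEngine;
using UnityEngine.Tilemaps;

public class TilemapHighlight : MonoBehaviour
{
    public Tilemap whiteOverlay;   // the white copy sitting above the level tilemap

    public void SetHighlight(float alpha)
    {
        // Only the alpha changes; RGB stays white so the overlay only brightens.
        whiteOverlay.color = new Color(1f, 1f, 1f, alpha);
    }
}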

Getting a round blur halo around a square button in order to place over an image

I updated this with an image of what I am trying to achieve: it's a blur on an image which adds a touch of shading, so that even white on white is visible. I am basically working on putting buttons of various kinds on top of images and welcome any and all assistance on best practices. I know Facebook does this in some way, as an app example.
To summarize what I am trying to achieve: I have an image that takes up the full screen, and I would like to place a button on top of that image that blurs the area around it, with some padding, so that it looks clean on top of the image. My button is a heart PNG, a red outline with a clear inside, and its frame is a square because of the irregular shape. I would like the heart to sit on top of a circle that blurs the image behind it so the button can always be seen.
I found a number of similar solutions to this problem using UIBlurEffect, but nothing that specifically addresses the "square image" or how I would control making the blur circle larger or smaller in terms of the padding around the square. I tinkered with creating a transparent UIView, placing a circle with a blur into it and then adding the button with their centers aligned, but this seems like an incorrect approach and wasn't quite working. I suspect that for people with expertise this is just a matter of using UIBlurEffect correctly.

Unity - Render Texture from Camera's targetTexture produces seams

I am attempting to render a specific section of my scene using a separate camera, and a render texture. That object is on a separate layer that the main camera is not rendering, but a separate camera is. The secondary camera has a target texture set to be a render texture that I have created. Everything is working as intended except for the fact that the object, when rendered to a texture, has a bunch of seams that are not present when rendering directly to the screen.
What it looks like when rendered directly to the screen:
Correct
What it looks like when rendered to a texture, and then displayed on a quad in the scene:
Incorrect
Notice how the second image has a bunch of transparent "lines" in between the sprites where there shouldn't be any.
I am using a basic transparent shader to display the render texture on the quad (since the background isn't part of the render texture, just the black crowd part). I have tried a number of different shaders, and none of them seem to make a difference.
The render texture's settings are: Width: Screen.width, Height: Screen.height, Format: RenderTextureFormat.ARGBFloat.
Unity Version: 5.2.3f1 - iOS Platform
Note: The reason I am doing this is so that I can apply a "Blur" image effect to the texture, and make the crowd in the foreground appear to be out of focus. Any alternative suggestions for how to do this are also welcome.
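For reference, the setup described above boils down to roughly the following (a sketch only; the layer name, class, and fields are illustrative, not taken from the project):

using UnityEngine;

public class CrowdCapture : MonoBehaviour
{
    public Camera crowdCamera;      // secondary camera that renders only the separate layer
    public Renderer quadRenderer;   // quad in the scene that displays the captured texture

    void Start()
    {
        // Same settings as listed above.
        var rt = new RenderTexture(Screen.width, Screen.height, 24,
                                   RenderTextureFormat.ARGBFloat);

        crowdCamera.targetTexture = rt;
        crowdCamera.cullingMask = LayerMask.GetMask("Crowd"); // assumed layer name
        crowdCamera.clearFlags = CameraClearFlags.SolidColor;
        crowdCamera.backgroundColor = Color.clear;             // keep the background transparent

        quadRenderer.material.mainTexture = rt;                // show the result on the quad
    }
}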
I'm not quite sure -- but it almost sounds like you have line ghosting. You may want to give this a read and let me know if that's what you're dealing with or not:
The reason for this is due to how the texture image was authored, combined with the filtering that most 3d engines use when textures are displayed at different sizes on screen.
Your image may have coloured areas which are completely opaque, coloured areas which are partially transparent, and areas which are completely transparent. However, the areas where your alpha channel is completely transparent (0% opacity) actually still have a colour value too. PNGs (or at least the PNGs that Photoshop exports) seem to default to using white for the completely transparent pixels. With other formats or editors, this may be black. Both are equally undesirable when it comes to use in a 3d engine.
You may think, "why is the white colour a problem if it's completely transparent?". The problem occurs because when your texture appears on screen, it's usually either upscaled or downscaled depending on whether the pixels in the texture's image are appearing larger or smaller than actual size. For downsizing, a series of downscaled versions gets created during import. These downscaled versions get used when the texture is displayed at smaller sizes or steeper angles in relation to the view, and they are intended to improve visual quality and make rendering faster. This process is called "mip-mapping" - read more about mip-mapping here. For upscaling, simple bilinear interpolation is normally used.
The scaled versions are usually created using simple bilinear interpolation, which means that the transparent pixels are mixed with the neighbouring visible pixels. With the mipmaps, the problem of invisible pixels mixing with the visible pixel colours increases at each smaller level (with the result that your nasty white edges become more apparent at further distances away).
The solution is to ensure that these completely transparent pixels have a colour value which matches their neighbouring visible pixels, so that when the interpolation occurs, the colour 'bleed' from the invisible pixels is of the appropriate colour.
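As a tiny illustration of that bleed (my own example, not from the original answer): averaging an opaque green texel with a fully transparent but white texel, the way plain bilinear filtering does with straight alpha, produces a washed-out half-transparent colour:

using UnityEngine;

public static class EdgeBleedDemo
{
    public static void Run()
    {
        Color opaqueGreen      = new Color(0.10f, 0.72f, 0.33f, 1f);
        Color transparentWhite = new Color(1f, 1f, 1f, 0f);   // invisible, but still white

        // What simple bilinear filtering does between two neighbouring texels:
        Color naive = Color.Lerp(opaqueGreen, transparentWhite, 0.5f);
        Debug.Log(naive);   // roughly (0.55, 0.86, 0.67, 0.5): paler than the green it came from
    }
}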
To solve this (in Photoshop) I always use the free "Solidify" tool from the Flaming Pear Free Plugins pack, like this:
Download and install the Flaming Pear "Free Plugins" pack (near the bottom of that list)
Open your PNG in photoshop.
Go to Select -> Load Selection and click OK.
Go to Select -> Save Selection and click OK. This will create a new alpha channel.
Now Deselect all (Ctrl-D or Cmd-D)
Select Filter -> Flaming Pear -> Solidify B
Your image will now appear to be entirely made of solid colour, with no transparent areas, however your transparency information is now stored in an explicit alpha channel, which you can view and edit by selecting it in the channels palette.
Now re-save your image, and you should find your white fuzzies have disappeared!
Source: http://answers.unity3d.com/questions/10302/messy-alpha-problem-white-around-edges.html
Turns out that the shader I was using for my scene was using "Blend SrcAlpha OneMinusSrcAlpha" for some reason, when it should have been using "Blend One OneMinusSrcAlpha". This was causing objects with alpha less than 1 to make the objects under them semi-transparent as well, exposing the camera's clear colour background.

Incorrect white matte behind antialiasing on imported sprites

I'm importing a sprite into Unity, and adding it to a Screen Space Overlay canvas to use for a UI.
The image I'm importing looks exactly as I want it, but in Unity the anti-aliased edges look like they're going to a white background color, instead of just fading over whatever is actually behind them.
I'm using these import settings:
I'm using a default UI/Image component to add it to the canvas.
This is the image I'm importing - it's a 32 bit PNG exported from Fireworks: (also shown over a black background)
Just to confirm, this looks fine everywhere else in Unity (preview panels, pickers, etc.). I am packing this sprite using the built-in Sprite Packer, if that changes anything.
And the final result:
How can I get rid of these artifacts on the corners?
The problem was the RGB values of the transparent pixels. By default they are white, and any scaling operations cause this white color to be blended with the partially transparent pixels.
I essentially made a slightly larger version of the button background shape, put it in a layer behind everything, and then wrote back into the alpha channel making it transparent. This means that the neighboring pixels are then the same color as the partially transparent ones.
The end result:
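If you would rather do that fix in code instead of in an image editor, a rough sketch of the same idea (bleeding visible colours into fully transparent neighbours) might look like this; the texture must be readable (Read/Write enabled), and the class and method names are mine:

using UnityEngine;

public static class AlphaBleed
{
    public static void BleedOnce(Texture2D tex)
    {
        Color[] src = tex.GetPixels();
        Color[] dst = (Color[])src.Clone();
        int w = tex.width, h = tex.height;

        for (int y = 0; y < h; y++)
        for (int x = 0; x < w; x++)
        {
            int i = y * w + x;
            if (src[i].a > 0f) continue;          // only touch fully invisible pixels

            // Average the RGB of any visible 4-neighbours.
            Color sum = Color.clear; int count = 0;
            if (x > 0     && src[i - 1].a > 0f) { sum += src[i - 1]; count++; }
            if (x < w - 1 && src[i + 1].a > 0f) { sum += src[i + 1]; count++; }
            if (y > 0     && src[i - w].a > 0f) { sum += src[i - w]; count++; }
            if (y < h - 1 && src[i + w].a > 0f) { sum += src[i + w]; count++; }

            if (count > 0)
                dst[i] = new Color(sum.r / count, sum.g / count, sum.b / count, 0f);
        }

        tex.SetPixels(dst);
        tex.Apply();
    }
}

Running BleedOnce a few times spreads the colour one pixel further out from the visible edges each pass.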

How to create an art asset that can be dynamically colored in software?

I asked this question on the Graphic Design site, but it includes a programming component that might be better answered here.
Specifically, I have a bunch of photographic crayon images. I would like to remove the color from one to produce a neutral image that I can load into an iPhone app that I'm writing and dynamically color. The crayon images have dark regions (shadows) and light regions (shine) which I would like to preserve. I will be dynamically coloring it with many different colors, ranging from white to rainbow colors to black.
My first inclination is to turn the image into a grayscale image and then somehow turn the color channel into an alpha channel, and change the color of all pixels to black. Then I could use it as a mask. However, this would only preserve the shadows, and I would lose all the highlights.
Any ideas?
Two options come to mind:
Make a grayscale version that could be tinted as you said, with the shadows and highlights simply white and gray.
Make an outline, i.e. an image with alpha that had 0% opacity in the colored parts, say 10% white over the highlights, 10% black on the shadows, and 100% black/dark gray for the lines/edges. The idea being that you could put any color under the outline and it would look right.
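Just to make the two options concrete, here is how the per-pixel math could look (written as Unity-style C# purely for illustration; the actual app is on iOS, but the arithmetic is the same):

using UnityEngine;

public static class CrayonTintDemo
{
    // Option 1: multiply a grayscale value by the tint.
    // Dark shadows stay dark, but white highlights simply become the tint colour.
    public static Color TintGrayscale(float gray, Color tint)
    {
        return new Color(gray * tint.r, gray * tint.g, gray * tint.b, 1f);
    }

    // Option 2: composite a semi-transparent shading overlay over a flat tint.
    // 10% white on highlights and 10% black on shadows survive any base colour.
    public static Color OverlayOnTint(Color overlay, Color tint)
    {
        float a = overlay.a;
        return new Color(overlay.r * a + tint.r * (1f - a),
                         overlay.g * a + tint.g * (1f - a),
                         overlay.b * a + tint.b * (1f - a),
                         1f);
    }
}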