How to check for matching colors in Unity3D

I am scanning a Rubik's Cube with a mobile camera. I am getting the colors of the sides, but even identical colors come back with slightly different values. Is there a way to detect that two sampled colors match?
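One common approach (a sketch, not from this thread; the class name and tolerance value are illustrative) is to compare the sampled colors channel by channel against a tolerance:

using UnityEngine;

public static class ColorMatch
{
    // Returns true when two sampled colors are within `tolerance`
    // of each other on every RGB channel (tolerance is in 0..1).
    public static bool Approximately(Color a, Color b, float tolerance = 0.1f)
    {
        return Mathf.Abs(a.r - b.r) < tolerance
            && Mathf.Abs(a.g - b.g) < tolerance
            && Mathf.Abs(a.b - b.b) < tolerance;
    }
}

Converting to HSV first (Color.RGBToHSV) and matching mainly on hue tends to be more robust to lighting variation than comparing raw RGB values.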

Related

Unity 2D display "flash effect" above tiles

I want to apply a brightening effect over my scene.
My scene contains tiles, and I want to flash the screen white for a few frames from code.
I have already tried this code:
using UnityEngine.Tilemaps;

private Tilemap tm;
...
tm = GetComponent<Tilemap>();
// Tints the tilemap; channel values below 1 can only darken, never brighten.
tm.color = new Color(0.5f, 1.0f, 1.0f, 1.0f);
This code darkens the scene by a certain amount, but I want to brighten it.
Your code is not working because in Unity a rendered image (in your case, a tile) shows its original colors only when its color property is white (255, 255, 255, 255).
The color property is multiplied with the image's pixels, so a tint can only keep a channel the same or darken it, never brighten it.
For instance, if you set the color of an image to red, the image's colors shift toward red; your tint of (0.5, 1.0, 1.0, 1.0) halves every pixel's red channel, which is why the scene gets darker.
As I see it you have 2 ways to perform the white flash:
A) Add another image of a white rectangle that covers the whole screen and set its alpha to a low value (the lower the value, the subtler the flash).
In the editor, disable this object's renderer, and enable it from code when you want to perform the flash (you can improve this with animations or a coroutine to get a smooth flash, as sketched below).
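A minimal sketch of option A, assuming a full-screen white UI Image wired into a field (the names and timings here are illustrative, not from the answer):

using System.Collections;
using UnityEngine;
using UnityEngine.UI;

public class ScreenFlash : MonoBehaviour
{
    public Image flashOverlay;      // full-screen white Image, disabled by default
    public float peakAlpha = 0.6f;  // flash strength
    public float duration = 0.15f;  // fade-out time in seconds

    public void Flash()
    {
        StartCoroutine(FlashRoutine());
    }

    private IEnumerator FlashRoutine()
    {
        flashOverlay.enabled = true;
        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // Fade the white overlay from peakAlpha down to 0.
            float a = Mathf.Lerp(peakAlpha, 0f, t / duration);
            flashOverlay.color = new Color(1f, 1f, 1f, a);
            yield return null;
        }
        flashOverlay.enabled = false;
    }
}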
B) Install the package "2D Light". This is an experimental package that allows you to render 2D lights.
This package contains a number of components that let you simulate light.
I have found a way to do this.
I created a new PNG that only contains white shapes on a transparent background.
There are about 20 pieces that match the shapes of my level tilemap.
Now I just create a new (white) tilemap above the level tilemap in the shape of the highlight.
Then I set the alpha of the white tilemap in code.
It works :)
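For reference, driving the white tilemap's alpha from code could look like this (a sketch; the field name is an assumption, not from the post):

using UnityEngine;
using UnityEngine.Tilemaps;

public class TilemapFlash : MonoBehaviour
{
    public Tilemap whiteOverlay; // the white tilemap drawn above the level tilemap

    // Sets the overlay's opacity: 0 = invisible, 1 = fully white.
    public void SetFlashAlpha(float alpha)
    {
        whiteOverlay.color = new Color(1f, 1f, 1f, alpha);
    }
}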

Unity3D - Make texture edges not stretch

I've been searching around for this one for a bit, and unfortunately I can't seem to find any good, consistent results. So, in the Unity UI system, buttons can stretch without becoming pixelated or distorted. This is because the texture is split up into 9 parts - the corners, middle, and sides.
This works because the button's middle and sides are stretched, but not the corners. Then, the button appears not pixelated, at any dimension.
So, the question is as follows: How can I do the same thing for a transparent, unlit texture in 3D space? I have a speech bubble texture on a flat plane that I know how to re-scale to fit the text in the speech bubble.
I've set the texture type to Multiple Sprite, and divided it up into 9 parts. However, I cannot seem to find where I can set the texture to act like the UI button does, and I'm not sure that this is even possible in this way in 3D space.
Is there a way, or should I just make the different parts of the texture different objects, and move them together? That would seem very inefficient and ugly compared to this.
To accomplish what you are asking, you would need to create tiles for this speech bubble and then write a script that procedurally builds a speech bubble based on the plane's scale value. You could also try just changing the texture's Filter Mode to Point.
However, I really don't think you should be using textures for this anyway. Why not just use a Unity Canvas and set the Render Mode to World Space? Then you can make your text box a sprite, not a texture, and set its filter mode to Point. This would also make it a lot easier when you want there to be text in the speech bubble later on.
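A rough sketch of that setup done from code, in case it helps (normally you would build this in the editor; bubbleSprite is an assumed field, and the sprite needs its 9-slice border defined in the Sprite Editor):

using UnityEngine;
using UnityEngine.UI;

public class SpeechBubbleSetup : MonoBehaviour
{
    public Sprite bubbleSprite; // sprite with a 9-slice border set in the Sprite Editor

    void Start()
    {
        // World-space canvas so the bubble lives in the 3D scene.
        var canvas = gameObject.AddComponent<Canvas>();
        canvas.renderMode = RenderMode.WorldSpace;

        var imageGO = new GameObject("Bubble", typeof(Image));
        imageGO.transform.SetParent(transform, false);

        var image = imageGO.GetComponent<Image>();
        image.sprite = bubbleSprite;
        image.type = Image.Type.Sliced; // stretch middle/edges, keep corners intact
    }
}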

Unity - Render Texture from Camera's targetTexture produces seams

I am attempting to render a specific section of my scene using a separate camera, and a render texture. That object is on a separate layer that the main camera is not rendering, but a separate camera is. The secondary camera has a target texture set to be a render texture that I have created. Everything is working as intended except for the fact that the object, when rendered to a texture, has a bunch of seams that are not present when rendering directly to the screen.
What it looks like when rendered directly to the screen:
(screenshot: correct, no seams)
What it looks like when rendered to a texture, and then displayed on a quad in the scene:
(screenshot: incorrect, visible seams)
Notice how the second image has a bunch of transparent "lines" in between the sprites where there shouldn't be any.
I am using a basic transparent shader to display the render texture on the quad (since the background isn't part of the render texture, just the black crowd part). I have tried a number of different shaders, and none of them seem to make a difference.
The render texture's settings are: Width: Screen.width, Height: Screen.height, Format: RenderTextureFormat.ARGBFloat.
Unity Version: 5.2.3f1 - iOS Platform
Note: The reason I am doing this is so that I can apply a "Blur" image effect to the texture, and make the crowd in the foreground appear to be out of focus. Any alternative suggestions for how to do this are also welcome.
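For context, the setup described in the question might be wired up roughly like this (a sketch under assumed names; the question does not include its actual code):

using UnityEngine;

public class CrowdCapture : MonoBehaviour
{
    public Camera crowdCamera; // secondary camera rendering only the crowd layer
    public Renderer quad;      // quad in the scene that displays the capture

    void Start()
    {
        var rt = new RenderTexture(Screen.width, Screen.height, 24,
                                   RenderTextureFormat.ARGBFloat);
        crowdCamera.targetTexture = rt; // camera renders into the texture
        quad.material.mainTexture = rt; // quad samples the same texture
    }
}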
I'm not quite sure, but it almost sounds like you have line ghosting. You may want to give this a read and let me know if that's what you're dealing with:
The reason for this is due to how the texture image was authored, combined with the filtering that most 3d engines use when textures are displayed at different sizes on screen.
Your image may have coloured areas which are completely opaque, coloured areas which are partially transparent, and areas which are completely transparent. However, the areas where your alpha channel is completely transparent (0% opacity) still have a colour value too. PNGs (or at least the PNGs Photoshop exports) seem to default to using white for the completely transparent pixels. With other formats or editors, this may be black. Both are equally undesirable when it comes to use in a 3D engine.
You may think, "why is the white colour a problem if it's completely transparent?". The problem occurs because when your texture appears on screen, it's usually either upscaled or downscaled depending on whether the pixels in the texture's image appear larger or smaller than actual size. For downsizing, a series of downscaled versions gets created during import. These downscaled versions get used when the texture is displayed at smaller sizes or steeper angles in relation to the view, and are intended to improve visual quality and make rendering faster. This process is called "mip-mapping" - read more about mip-mapping here. For upscaling, simple bilinear interpolation is normally used.
The scaled versions are usually created using simple bilinear interpolation, which means that the transparent pixels are mixed with the neighbouring visible pixels. With the mipmaps, each smaller level makes the problem of invisible pixels mixing into visible pixel colours worse (with the result that your nasty white edges become more apparent at further distances).
The solution is to ensure that these completely transparent pixels have a colour value which matches their neighbouring visible pixels, so that when the interpolation occurs, the colour 'bleed' from the invisible pixels is of the appropriate colour.
To solve this (in Photoshop) I always use the free "Solidify" tool from the Flaming Pear Free Plugins pack, like this:
Download and install the Flaming Pear "Free Plugins" pack (near the bottom of that list)
Open your PNG in photoshop.
Go to Select -> Load Selection and click OK.
Go to Select -> Save Selection and click OK. This will create a new alpha channel.
Now Deselect all (Ctrl-D or Cmd-D)
Select Filter -> Flaming Pear -> Solidify B
Your image will now appear to be entirely made of solid colour, with no transparent areas, however your transparency information is now stored in an explicit alpha channel, which you can view and edit by selecting it in the channels palette.
Now re-save your image, and you should find your white fuzzies have disappeared!
Source: http://answers.unity3d.com/questions/10302/messy-alpha-problem-white-around-edges.html
Turns out that the shader I was using for my scene was using "Blend SrcAlpha OneMinusSrcAlpha" for some reason, when it should have been using "Blend One OneMinusSrcAlpha" (premultiplied alpha). This was causing objects with alpha less than 1 to make the objects under them semi-transparent as well, exposing the camera's clear color background.

Unity 2D: Irregular shape window to a different background world?

I want to make an irregular shape display its colors from a different set of images. Currently, I have a flashlight represented by an irregular (kite-shaped) mesh; when it is cast over an area, certain objects appear, and when the shape is removed, the items disappear. Now I want the background within this irregular shape to display an altered version of the background.
I am planning to use a RenderTexture that gets its contents from a camera viewing the flashlight's corresponding location in the other world, and then use this image as the basis of the flashlight's altered background. However, when I try this, the flashlight shows black instead of the other world. When I texture a plane, the RenderTexture works properly. Does anyone have any ideas how to accomplish this?
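For reference, the basic wiring being described (which the asker says works on a plane) looks something like this; the field names are assumptions, and one thing worth checking is that the irregular mesh has UVs spanning the texture, since a mesh with degenerate UVs can end up sampling a single (black) texel:

using UnityEngine;

public class OtherWorldView : MonoBehaviour
{
    public Camera otherWorldCamera;   // camera viewing the corresponding spot in the other world
    public Renderer flashlightShape;  // the irregular (kite-shaped) mesh

    void Start()
    {
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        otherWorldCamera.targetTexture = rt;
        // A plane works because its UVs span 0..1; a custom mesh
        // needs sensible UVs to show anything but a flat color.
        flashlightShape.material.mainTexture = rt;
    }
}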

Unity - How can I control fog color based on skybox color?

I have a gradient on a skybox, and I want the terminal fog color to be the same as the sky color behind the object that is being occluded. But it still needs to be fog and respect distance.
I see one way of doing it without actually using fog, with a gradient of my choosing on a skybox. The skybox gets masked by the depth map and then overlaid onto the non-sky geometry. You would be able to look down into the depths, and the blackness below would gradually get darker like "fog" because it is masked by the depth operation; look up and you'd see the same phenomenon with the lighter color. Basically, the "fog" terminal color would be controlled by camera angle.
What would be the best way to implement this, and could it be performant on mobile devices?