I made a 2D outline shader, but the outline does not render beyond the boundary of the sprite, so I need to enlarge the area the shader is allowed to draw into.
Is there a simple way to solve this, other than remaking the sprites or changing positions in the vertex shader?
In principle you can only draw within the geometry of your object.
In theory, you could tweak the geometry in the vertex shader to expand the border, but I have no idea how to do it reliably.
Almost all of the edge shaders I'm aware of use a full-screen post-processing pass to render outlines instead.
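That said, here is a very rough sketch of what the expand-the-geometry idea might look like in the vertex stage, assuming a plain quad sprite that is not tightly packed in an atlas. _OutlineExpand is a made-up material property, and the UV compensation is exactly the part that tends to break in practice:

// Hypothetical excerpt from the sprite shader's vertex stage
// (assumes UnityCG.cginc is included in the surrounding shader).
// _OutlineExpand is an assumed property, e.g. 0.1 = 10% larger quad.
struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

float _OutlineExpand;

v2f vert(appdata v)
{
    v2f o;
    // Scale the quad outward around the sprite's pivot so the outline
    // has room to be drawn outside the original bounds.
    v.vertex.xy *= 1.0 + _OutlineExpand;
    o.pos = UnityObjectToClipPos(v.vertex);
    // Stretch the UVs by the same factor around the centre so the sprite
    // itself still covers its original area; samples that land outside
    // 0..1 hit the atlas/clamp region, which is exactly what makes this
    // unreliable for packed sprites.
    o.uv = (v.uv - 0.5) * (1.0 + _OutlineExpand) + 0.5;
    return o;
}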
I know how to draw basic graphics:
canvas.drawCircle();
canvas.drawPolygon();
...
Is there any way to draw a frustum and make it look 3D? Like these:
The 3D effect in the shapes that you posted most likely comes from shaders, not from a frustum. There is some limited ability to use shaders in Flutter; you can read more about it here: https://wolfenrain.medium.com/flutter-shaders-an-initial-look-d9eb98d3fd7a
I've mocked up what I am trying to accomplish in the image below - I'm trying to pinch the pixels in towards the center of an AR marker so that when I overlay AR content, the marker is less noticeable.
I am looking for examples or tutorials that I can reference to start learning how to create a shader to distort the texture, but I am coming up with nothing.
What's the best way to accomplish this?
This can be achieved using GrabPass.
From the manual:
GrabPass is a special pass type - it grabs the contents of the screen where the object is about to be drawn into a texture. This texture can be used in subsequent passes to do advanced image based effects.
The way distortion effects work is basically that you render the contents of the GrabPass texture on top of your mesh, except with its UVs distorted. A common way of doing this (for effects such as heat distortion or shockwaves) is to render a billboarded plane with a normal map on it, where the normal map controls how much the UVs for the background sample are distorted. This works by transforming the normal from world space to screen space, multiplying it with a strength value, and applying it to the UV. There is a good example of such a shader here. You can also technically use any mesh and use its vertex normal for the displacement in a similar way.
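To make that concrete, here is a minimal sketch of such a GrabPass distortion shader for the built-in render pipeline. It skips the world-to-screen transform and just uses the tangent-space normal's xy directly, and the property names (_BumpMap, _Strength) are assumptions:

// Sketch only: _BumpMap and _Strength are assumed property names.
Shader "Unlit/GrabDistortion (sketch)"
{
    Properties
    {
        _BumpMap ("Normal Map", 2D) = "bump" {}
        _Strength ("Distortion Strength", Float) = 0.05
    }
    SubShader
    {
        Tags { "Queue" = "Transparent" "RenderType" = "Transparent" }

        // Grab what has already been rendered behind this object into _GrabTexture.
        GrabPass { }

        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _GrabTexture;
            sampler2D _BumpMap;
            float _Strength;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f
            {
                float4 pos     : SV_POSITION;
                float2 uv      : TEXCOORD0;
                float4 grabPos : TEXCOORD1;
            };

            v2f vert(appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                // Projective screen-space coordinates for sampling the grab texture.
                o.grabPos = ComputeGrabScreenPos(o.pos);
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // The normal map's xy drives how far the background sample is pushed.
                // grabPos is projective, so scale the offset by w before adding it.
                float2 bump = UnpackNormal(tex2D(_BumpMap, i.uv)).xy;
                i.grabPos.xy += bump * _Strength * i.grabPos.w;
                return tex2Dproj(_GrabTexture, UNITY_PROJ_COORD(i.grabPos));
            }
            ENDCG
        }
    }
}

Putting a material with this shader on a billboarded quad will smear whatever was rendered behind the quad according to the normal map.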
Apart from normal mapped planes, another way of achieving this effect would be to pass in the screen-space position of the tracker into the shader using Shader.SetGlobalVector. Then, inside your shader, you can calculate the vector between your fragment and the object and use that to offset the UV, possibly using some remap function (like squaring the distance). For instance, you can use float2 uv_displace = normalize(delta) * saturate(1 - length(delta)).
If you want to control exactly how and when this effect is applied, make sure the shader has ZTest and ZWrite set to Off, and then set its render queue to be after the background but before your tracker.
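And here is a sketch of just the fragment stage for the second, tracker-centred variant, reusing the GrabPass, _Strength and vertex setup from the shader above. _TrackerScreenPos is an assumed global holding the tracker's position in 0..1 screen UVs, set from script with Shader.SetGlobalVector, and the pass would carry the ZTest Off / ZWrite Off state and render queue described above:

// Assumed global: tracker position in 0..1 screen UVs, set from C# via
// Shader.SetGlobalVector("_TrackerScreenPos", ...).
float4 _TrackerScreenPos;

fixed4 frag(v2f i) : SV_Target
{
    // Vector from the tracker's screen position to this fragment.
    float2 screenUV = i.grabPos.xy / i.grabPos.w;
    float2 delta = screenUV - _TrackerScreenPos.xy;

    // Displacement falls off with distance from the tracker (the remap
    // mentioned above); flip the sign to push instead of pinch.
    float2 uv_displace = normalize(delta) * saturate(1 - length(delta));
    i.grabPos.xy -= uv_displace * _Strength * i.grabPos.w;

    return tex2Dproj(_GrabTexture, UNITY_PROJ_COORD(i.grabPos));
}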
For AR apps, it is likely possible to avoid the performance overhead of GrabPass by using the camera background texture instead of a GrabPass texture. You can try looking inside your camera background script to see how it passes the camera texture to the shader and replicate that.
Here are two videos demonstrating how GrabPass works:
https://www.youtube.com/watch?v=OgsdGhY-TWM
https://www.youtube.com/watch?v=aX7wIp-r48c
I am new to writing shaders. I want to use a texture for a 6-sided skybox in Unity, and I want that texture to be repeated several times, also called tiling.
But the default 6-sided skybox shader in Unity doesn't have a tiling option. Can anyone write a custom shader for a 6-sided skybox in Unity that has an option to tile the textures? I'd also like an option to apply a color tint to the texture if possible. Thanks in advance.
Tiling can be achieved by multiplying the texcoord by the number of tiles you want. In a surface shader it's uv_YourTex (likely uv_MainTex) instead of texcoord. I'm writing from a phone so I can't post a full example, but it's really just one multiplication.
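For reference, a minimal sketch of one face of such a skybox shader could look like this. _Tiling and _Tint are assumed property names, the other five faces would be identical apart from the texture they sample, and the face textures' wrap mode needs to be set to Repeat:

Shader "Skybox/6 Sided Tiled (sketch)"
{
    Properties
    {
        // _Tint and _Tiling are assumed property names for this sketch.
        _Tint ("Tint Color", Color) = (0.5, 0.5, 0.5, 1)
        _Tiling ("Tiling", Float) = 2
        _FrontTex ("Front (+Z)", 2D) = "grey" {}
        // The other five face textures (_BackTex, _LeftTex, _RightTex,
        // _UpTex, _DownTex) would be declared here exactly like _FrontTex.
    }
    SubShader
    {
        Tags { "Queue" = "Background" "RenderType" = "Background" "PreviewType" = "Skybox" }
        Cull Off
        ZWrite Off

        // One pass per face; only the front face is shown, the other five
        // passes are identical apart from the sampler they use.
        Pass
        {
            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _FrontTex;
            half4 _Tint;
            float _Tiling;

            struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
            struct v2f { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

            v2f vert(appdata v)
            {
                v2f o;
                o.pos = UnityObjectToClipPos(v.vertex);
                o.uv = v.uv;
                return o;
            }

            fixed4 frag(v2f i) : SV_Target
            {
                // Tiling is just a multiplication of the UVs (the texture's
                // wrap mode has to be Repeat so it wraps instead of clamping).
                fixed4 col = tex2D(_FrontTex, i.uv * _Tiling);
                // The tint is another multiplication.
                return col * _Tint;
            }
            ENDCG
        }
    }
}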
I don't know your specific scenario, but I needed a more detailed sky from a not-very-detailed texture, and instead of the UVs I used the view direction to sample the texture. It made it look like clouds in the distance are further away, and the clouds shift as you move. It's in this asset.
View-direction sampling will not help if you are trying to make space, though, which seems to be the case.
Also, IMHO, tiling on the skybox might be too visible.
I have a need for setting up clipping planes that aren't perpendicular to the camera. Doing that for the far plane was easy: I just added a shader that clears the background.
I just can't figure out how to do the same for the near clipping plane. I've considered solutions involving multiple shaders and planes, a special cutting shader, multiple cameras, or somehow storing the view as a texture, but those ideas are mostly imperfect even if they were implementable. What I basically need is a shader that says "don't render anything that's in front of me". Is that possible? Can I, e.g., make a shader that marks the pixels it passes as "final"?
I am making a 2d game in the perspective of Terraria/Starbound. I want the lighting to look similar to this:
I've tried to get lighting like this by adding a material to all the sprites in my game and giving them a sprite diffuse shader. Then I made a point light wherever I needed light. There were two problems with this, though: 1) Where the light was most intense, it drained the color out of the sprite and made it lighter. 2) I noticed a big FPS drop (and I only had one point light!).
Is there any way of achieving lighting like this without having to write my own lighting engine? I've searched the Asset Store, and I've looked into whether Unity has any way of handling 2D lighting from this angle, but I have found nothing.
If I do have to write my own lighting engine, would that be too complex for someone who is relatively new to Unity and has only ~8 months of experience?
I'll assume you are using a tile map.
You need a field-of-view map, which you can build by following this: http://www.redblobgames.com/articles/visibility/
Using such a map, you know exactly the color tint for each tile. Now just blend that color into the SpriteRenderer of every tile on the map.
Somebody already created a line of sight plugin:
http://forum.unity3d.com/threads/light-of-sight-2d-dynamic-lighting-open-source.295968/
Here's my hacky solution on GitHub
There are two cameras.
Empty tiles on the tilemap are filled in with white blocks (only one of the cameras renders these).
A Gaussian blur is applied to the output of the camera rendering the white blocks.
Then the two cameras are blended, darkening everything not covered by the blurred white (a sketch of that blend step is below).
You can adjust how far the "light" penetrates by changing the white tile sprite's Pixels Per Unit.
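For reference, a minimal sketch of what that final blend step could look like as an image-effect shader, assuming the blurred white-block camera renders into a RenderTexture that is passed in as _LightMaskTex (an assumed name) and the shader is applied to the main camera's image with Graphics.Blit:

Shader "Hidden/LightMaskComposite (sketch)"
{
    Properties
    {
        _MainTex ("Scene", 2D) = "white" {}
        _LightMaskTex ("Blurred Light Mask", 2D) = "white" {}
        _Darkness ("Unlit Darkness", Range(0, 1)) = 0.15
    }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always

        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;       // the main camera's rendered scene
            sampler2D _LightMaskTex;  // assumed name for the blurred white-block camera's output
            float _Darkness;

            fixed4 frag(v2f_img i) : SV_Target
            {
                fixed4 scene = tex2D(_MainTex, i.uv);
                // The blurred mask is white where light reaches and black inside
                // solid ground, so use it to fade between "dark" and fully lit.
                fixed mask = tex2D(_LightMaskTex, i.uv).r;
                return scene * lerp(_Darkness, 1.0, mask);
            }
            ENDCG
        }
    }
}

In practice this would run from OnRenderImage on the main camera, after the second camera has rendered and blurred its white blocks into the mask texture.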