How does one render a texture directly to the camera in Unity?

In Unity, I have two images in the form of textures that I am merging together (one overlaying the other). I do this in a compute shader and write the result to a RenderTexture. I want this RenderTexture to be everything that the output camera sees.
I have found articles saying to use the ReplacementShader property of the camera, but I couldn't get that to work properly.
Currently I have simply put the RenderTexture onto a UI RawImage that covers the whole Canvas, so that the entire camera view is filled. However, this causes a lot of lag and is obviously a suboptimal solution.
So how does one output the RenderTexture or the compute shader result directly onto the camera? Thanks.

You could probably use OnRenderImage:
Event function that Unity calls after a Camera has finished rendering, that allows you to modify the Camera's final image.
and use Graphics.Blit:
Copies source texture into destination render texture with a shader.
and do something like this:
using UnityEngine;

// This script goes onto the corresponding Camera
public class RenderReplacement : MonoBehaviour
{
    public RenderTexture replacement;

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // To overwrite the entire screen (null = blit directly to the screen)
        Graphics.Blit(replacement, (RenderTexture)null);

        // Or to overwrite only what this specific Camera renders
        //Graphics.Blit(replacement, dest);
    }
}
where, for dest:
The destination RenderTexture. Set this to null to blit directly to screen. See description for more information.
Note: as mentioned in the API docs, OnRenderImage is called after this Camera has already finished rendering. So, to make this more efficient (since we basically throw that render away), simply make sure the camera isn't rendering anything: disable all layers in its culling mask and, e.g., let it render only a single-color background.
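A minimal sketch of that camera setup, assuming it sits on the same GameObject as the RenderReplacement component above (the same settings can of course be made in the Inspector instead; the component name here is made up):

using UnityEngine;

// Minimal sketch (hypothetical component name): make the camera's own
// render as cheap as possible, since OnRenderImage replaces it anyway.
[RequireComponent(typeof(Camera))]
public class EmptyCameraSetup : MonoBehaviour
{
    void Awake()
    {
        Camera cam = GetComponent<Camera>();
        cam.cullingMask = 0;                          // render no layers at all
        cam.clearFlags = CameraClearFlags.SolidColor; // only clear to a flat color
        cam.backgroundColor = Color.black;
    }
}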

Related

Unity: Correctly Modifying a RenderTexture in VR

I have a basic working VR application. At a given point, I would like to show a static image that covers the screen. I had the idea to load it into a material and then modify the render texture, like so:
void OnRenderObject()
{
    Graphics.Blit(RenderTexture.active, RenderTexture.active, mat);
}
However, this doesn't really work: the player view shows black, as does the left eye, but the right eye shows the image.
How (and where) should I apply mat to the render texture so that it looks good in VR AND appears correctly in the player as well?
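One possible direction (an untested sketch, not a verified fix): rather than blitting RenderTexture.active onto itself in OnRenderObject, which reads and writes the same target, do the blit in OnRenderImage, where Unity hands the camera separate source and destination textures appropriate to its stereo mode:

using UnityEngine;

// Untested sketch: apply the material in OnRenderImage, where Unity
// provides distinct source/destination textures for this camera.
public class StaticOverlay : MonoBehaviour
{
    public Material mat; // material that draws the static image

    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        // src holds the camera's finished render; dest is where the
        // final image must end up. mat decides what gets drawn.
        Graphics.Blit(src, dest, mat);
    }
}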

How to set the Camera view to a specific area in the UI

My whole project works in the UI. One of my scenes has a VideoPlayer, and I would like to confine my MainCamera's view to that specific area. This is my recording scene, but for recording I have to use the Canvas Render Mode "Screen Space - Camera". How can I set my camera view to that specific area?
I tried a second camera with a culling mask, but that didn't work, because the VideoPlayer is already a child of my UI, and when the camera renders the UI it shows all the child objects.
Is it possible to show the camera wherever I want in the UI?
Yessir, it sure is. I literally had to do this a couple of days ago. Here is a tutorial/video I followed to figure out how to do it for my application. From the sounds of it, all you need to do is follow the video up to about 3 minutes, then use the end result (a camera showing its view in a UI window) for your application.
For reference, in case the video gets deleted, the process is as follows:
Create a new GameObject within the Canvas.
Add a Raw Image as a child of the new GameObject. We want a Raw Image because Raw Images in particular support the addition of a 2D texture, which is what we need in order to display the camera view.
Add a new Camera as a child of the GameObject.
Create a new Render Texture in your Project folder.
Add the Render Texture to your Camera's Camera component, in the Target Texture field.
Add the Render Texture to your image's Raw Image component, in the Texture field.
Move the Camera to what you want to view, and it should project to your UI GameObject.
If it doesn't do this correctly, or the image has a weird aspect ratio, check the dimensions of the Render Texture and your image. It took me a while to get everything to look normal, but it really just takes dimension adjustments. (A scripted version of these steps is sketched below.)
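If you'd rather wire those steps up from code than in the Inspector, a rough equivalent could look like this (an untested sketch; the component and field names are made up for the example):

using UnityEngine;
using UnityEngine.UI;

// Untested sketch mirroring the steps above; attach to the Raw Image
// and assign the camera in the Inspector.
public class CameraToRawImage : MonoBehaviour
{
    public Camera sourceCamera; // the camera whose view should appear in the UI

    void Start()
    {
        RawImage image = GetComponent<RawImage>();

        // Size the RenderTexture like the Raw Image to avoid aspect-ratio issues.
        Rect area = image.rectTransform.rect;
        var rt = new RenderTexture((int)area.width, (int)area.height, 16);

        sourceCamera.targetTexture = rt; // the camera now renders into rt
        image.texture = rt;              // and the Raw Image displays rt
    }
}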
Hope this helps!

How to apply a shader on multiple sprites' rendering in Unity3D?

In Unity3D, I'm trying to make a simple metaball effect using multiple sprites of a blurred disc, with the sprites moving randomly.
I'd like the shader to change the pixel colors based on the rendering of all the sprites together, not one by one.
Here is an example:
The picture on the left shows four of the blurred sprites; the picture on the right is the result of the shader.
And I have no clue how to do this.
If I understand your question correctly, what you can do is:
Create a new RenderTexture.
Move these sprites off-screen, out of the main camera's view.
Point a new orthographic camera at all of the sprites you've moved off-screen, and set this camera's Target Texture field (in the Inspector) to the render texture. This saves whatever that camera sees into the texture.
From there you can render that texture onto the surface of another game object (maybe a Quad?).
Attach a custom shader material to that quad that takes the render texture as input.
Perform whatever operations you wish on the render texture within this shader.
Position this quad in front of your main camera so that the final result gets rendered to screen. (A sketch of this setup follows below.)
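As a rough sketch of the setup side (the metaball look itself would live in the quad's shader; "Custom/MetaballThreshold" is a hypothetical shader name, not a built-in one):

using UnityEngine;

// Hedged sketch of the off-screen setup; the actual merging of the
// blurred discs happens in a custom threshold shader on the quad.
public class MetaballSetup : MonoBehaviour
{
    public Camera spriteCamera;   // orthographic camera pointed at the off-screen sprites
    public Renderer quadRenderer; // quad placed in front of the main camera

    void Start()
    {
        // The RenderTexture the sprites get drawn into.
        var rt = new RenderTexture(Screen.width, Screen.height, 0);

        // The orthographic camera renders the sprites into rt instead of to the screen.
        spriteCamera.orthographic = true;
        spriteCamera.targetTexture = rt;

        // The quad's material samples rt; its shader does the thresholding.
        quadRenderer.material = new Material(Shader.Find("Custom/MetaballThreshold"));
        quadRenderer.material.mainTexture = rt;
    }
}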
Does this make sense?

Render a RenderTexture to a mesh

In Unity, I'm updating a Render Texture procedurally (by writing data to it) via a DirectX plugin. I do something like the following to initially create my RenderTexture:
RenderTexture myTexture = new RenderTexture(100, 100, 0);
myTexture.Create();
transform.GetComponent<Renderer>().material.mainTexture = myTexture;
transform.GetComponent<Renderer>().enabled = true;
Then later on I modify the texture as needed. Yet this object's material (what it looks like in the rendered scene) doesn't change. If I click on that object, then on its material, and then on the RenderTexture attached to it, I can see it updating; for some reason it just doesn't update on the actual mesh. Why is this? I've tried using different built-in shaders, but that hasn't seemed to help. Is there a way to write a shader that renders a RenderTexture to a mesh, as one idea?
I found the best option is simply to use a RawImage instead of a Material, and apply the render texture to that RawImage's texture property (not mainTexture, just texture). A material can then even be applied to that RawImage if you want to use a shader.
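In code, that might look roughly like this (a minimal sketch; it assumes the script sits on a RawImage under a Canvas):

using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: display a procedurally updated RenderTexture via a RawImage.
public class ShowRenderTexture : MonoBehaviour
{
    RenderTexture myTexture;

    void Start()
    {
        myTexture = new RenderTexture(100, 100, 0);
        myTexture.Create();

        // Assign to the RawImage's texture property, not material.mainTexture.
        GetComponent<RawImage>().texture = myTexture;
    }
}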

Merge two cameras into one

I have two cameras in my scene: one follows 2D objects and the other is for 3D models (the second camera has a different angle of view than the first).
I want to create a third camera that combines the views from those two cameras (so this camera should "see" what the user sees in the game). Is that possible?
Why do I want such a camera? I would like to take a screenshot. I know I can use Application.CaptureScreenshot, but it also captures UI elements. Why can't I just disable the UI elements for a few milliseconds while the screen is captured? Because the effect is ugly: the user sees the moment when the UI disappears for a while. So my idea is to create a third camera that ignores UI layers (but sees exactly what the user sees), render a frame to a file, and destroy that camera.
My code so far:
private IEnumerator CaptureScreenCoroutine()
{
    // Wait till the last possible moment before screen rendering to hide the UI
    yield return null;
    this.EnableUi(false);

    // Wait for screen rendering to complete
    yield return new WaitForEndOfFrame();

    // Take the screenshot, then restore the UI
    Application.CaptureScreenshot("SomeScreenShot.png");
    this.EnableUi(true);
}
With a MonoBehaviour attached to the last rendered camera, you will need something like this:
void LateUpdate()
{
    // Allocate a texture the size of the screen
    Texture2D tex = new Texture2D(Screen.width, Screen.height);

    // Copy the current screen contents into the texture
    tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
    tex.Apply();
}
This will dump all the pixels on the screen into tex, and then you can do whatever you want with it. Note that this might cause a hitch, depending on the resolution you are rendering the cameras at.
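Alternatively, staying closer to the original "third camera" idea, something like the following untested sketch could render a single frame without the UI and save it, without ever hiding the Canvas (it assumes the UI objects live on the built-in "UI" layer):

using System.Collections;
using System.IO;
using UnityEngine;

// Untested sketch: render one frame with a clone of the main camera that
// ignores the "UI" layer, then read the result back and save it as a PNG.
public class NoUiScreenshot : MonoBehaviour
{
    public Camera sourceCamera; // the camera whose view should be captured

    public IEnumerator Capture(string path)
    {
        // Wait until rendering for this frame is done.
        yield return new WaitForEndOfFrame();

        // Clone the camera's settings, but cull away the UI layer.
        Camera cam = new GameObject("ScreenshotCamera").AddComponent<Camera>();
        cam.CopyFrom(sourceCamera);
        cam.cullingMask &= ~LayerMask.GetMask("UI");

        // Render a single frame into an off-screen RenderTexture.
        var rt = new RenderTexture(Screen.width, Screen.height, 24);
        cam.targetTexture = rt;
        cam.Render();

        // Read the pixels back and encode them to PNG.
        RenderTexture.active = rt;
        var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGB24, false);
        tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
        tex.Apply();
        RenderTexture.active = null;
        File.WriteAllBytes(path, tex.EncodeToPNG());

        // Clean up the temporary objects.
        Destroy(cam.gameObject);
        Destroy(rt);
        Destroy(tex);
    }
}

For the two-camera setup from the question, the same idea should extend by calling Render() on a UI-less clone of each camera, in the order they normally draw, before reading the pixels back.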