How to play 3D animation on a 2D canvas in Unity

I want to play an animation on a Canvas.
I made a canvas as shown in the following image.
I would like to play a golfer animation on the green area of the canvas.
Is that possible?
I have the animated model shown in the second figure.
I want to play that golfer animation on the canvas.
How can I do that?
I tried dragging the model under the canvas as a child object, but it doesn't work.

As I explained in my comment, I would do the following:
Put your object in a specific layer (called MyLayer for the sake of the example)
Set the Culling Mask of a new camera to render only that layer
Uncheck MyLayer in the Culling Mask of your main camera so it no longer renders your model
Set the new camera's Clear Flags to Depth only so it does not render the skybox
Create a new Render Texture in your project and drag & drop it into the Target Texture field of your new camera
Add a new Raw Image to your UI canvas and assign the render texture to its Texture field
Run your 3D animation
Your camera will render the animation into the image on your UI
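The steps above can also be wired up from a script. This is a minimal sketch, assuming a layer named MyLayer already exists in the project and that the public fields are assigned in the Inspector; it only runs inside the Unity runtime:

```csharp
using UnityEngine;
using UnityEngine.UI;

public class AnimationToCanvas : MonoBehaviour
{
    public Camera mainCamera;    // camera rendering the rest of the scene
    public Camera modelCamera;   // new camera that films only the golfer
    public RawImage targetImage; // Raw Image on the UI canvas
    public GameObject golfer;    // the animated model

    void Start()
    {
        int layer = LayerMask.NameToLayer("MyLayer");
        golfer.layer = layer; // child renderers may need the layer set too

        // New camera renders only MyLayer; main camera renders everything else.
        modelCamera.cullingMask = 1 << layer;
        mainCamera.cullingMask &= ~(1 << layer);

        // Depth-only clear so this camera does not draw the skybox.
        modelCamera.clearFlags = CameraClearFlags.Depth;

        // Render into a texture and show it on the Raw Image.
        var rt = new RenderTexture(512, 512, 16);
        modelCamera.targetTexture = rt;
        targetImage.texture = rt;
    }
}
```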

Related

Rendering camera output to a Render Texture and displaying with a clear background

I have a HoloLens app made in Unity using URP. I added a second camera with Render Type set to Base and set its Output Texture to a custom render texture. I set the camera's Background Type to Uninitialized. I created a material and set the render texture as the Surface Inputs Base Map. I then created a plane, dragged the material onto it, and positioned it in the field of view of the main camera. I added a cube in the field of view of the second camera. I followed this link to do this... Rendering to a Render Texture
The result is that in the main camera I see the plane with the output of the second camera (the cube), which is what I want. But I also see the rest of the plane as a black background. Is there a way to make the plane transparent so that only the cube is displayed?
(Screenshots: camera settings, render texture, material, plane with the render texture, and the main camera showing the cube on a black background.)
Your plane's material is set to Surface Type -> Opaque. Changing that to Surface Type -> Transparent with Blending Mode -> Alpha should solve your issue.
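The same change can be made from code if the material has to be switched at runtime. This is only a sketch: the property names (`_Surface`, `_SrcBlend`, `_DstBlend`, `_ZWrite`) are assumptions about the URP Lit shader and can vary between URP versions, so the Inspector change above is the safer route:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public static class UrpMaterialUtil
{
    // Switch a URP Lit material to Surface Type -> Transparent with Alpha blending.
    // Property names are assumptions about the URP Lit shader; check your URP version.
    public static void MakeTransparent(Material m)
    {
        m.SetFloat("_Surface", 1f); // 0 = Opaque, 1 = Transparent
        m.SetOverrideTag("RenderType", "Transparent");
        m.SetInt("_SrcBlend", (int)BlendMode.SrcAlpha);
        m.SetInt("_DstBlend", (int)BlendMode.OneMinusSrcAlpha);
        m.SetInt("_ZWrite", 0);
        m.renderQueue = (int)RenderQueue.Transparent;
    }
}
```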

How to set the Camera View to a Specific Area in the UI

My whole project works in the UI. One of my scenes has a VideoPlayer, and I would like to restrict my MainCamera to that specific area. This is my recording scene, but for recording the canvas Render Mode has to be "Screen Space - Camera". How can I set my camera view to that specific area?
I tried a second camera with a culling mask, but it didn't work: the VideoPlayer is already a child of my UI, so when the camera renders the UI it shows all child objects.
Is it possible to show the camera wherever I want in the UI?
Yessir, it sure is. I literally had to do this a couple of days ago. Here is a tutorial/video I followed to figure out how to do it for my application. From the sounds of it, all you need to do is follow the video up to ~3 minutes, then use the end result, a camera showcasing something in a UI window, in your application.
For reference, if the video gets deleted, the process is as follows:
Create a new GameObject within the Canvas
Add a Raw Image as a child of the new GameObject
We want a Raw Image because Raw Images in particular support a 2D Texture, which is what we need in order to display the camera view.
Add a new Camera as a child of the GameObject
Create a new Render Texture in your Project folder
Add the Render Texture to the Target Texture field of your Camera's Camera component
Add the Render Texture to the Texture field of your image's Raw Image component
Move the Camera to what you want to view and it should project to your UI GameObject.
If it doesn't do this correctly, or the image has a weird aspect ratio, check the dimensions of the Render Texture and your image. It took me a while to get everything to look normal, but it really just takes dimension adjustments.
Hope this helps!
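For the aspect-ratio point above, one option is to size the Render Texture from the Raw Image's rect at runtime instead of adjusting dimensions by hand. A minimal sketch (field names are illustrative, and this only runs inside Unity):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class FitRenderTexture : MonoBehaviour
{
    public Camera uiCamera;   // the camera rendering into the UI window
    public RawImage rawImage; // the Raw Image displaying the texture

    void Start()
    {
        // Match the texture's pixel size to the Raw Image's rect so the
        // camera output is not stretched or squashed.
        Rect r = rawImage.rectTransform.rect;
        var rt = new RenderTexture((int)r.width, (int)r.height, 16);
        uiCamera.targetTexture = rt;
        rawImage.texture = rt;
    }
}
```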

How to apply a shader on multiple sprites' rendering in Unity3D?

In Unity3D, I'm trying to make a simple metaball effect using multiple sprites of a blurred disc that move randomly.
I'd like the shader to change pixel colors based on the rendering of all the sprites together, not one by one.
Here is an example:
The picture on the left shows four sprites with the blurred disc; the picture on the right is the result of the shader.
And I have no clue how to do this.
If I understand your question correctly, what you can do is:
Create a new RenderTexture
Move these sprites off-screen, out of the main camera's view.
Point a new orthographic camera at all of the sprites you've moved off-screen and set this camera's Target Texture field (in the Inspector) to the render texture. This saves whatever the camera sees to that texture.
From here you can render that texture onto the surface of another GameObject (maybe a Quad?)
Attach a custom shader material to that quad that takes the render texture as input.
Perform whatever operations you wish on the render texture within this shader
Position this quad in front of your main camera so that the final result gets rendered to screen
Does this make sense?
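The steps above can be wired together in a script. A sketch, assuming a material using your custom threshold shader is assigned in the Inspector (all names are illustrative; runs only inside Unity):

```csharp
using UnityEngine;

public class MetaballCompositor : MonoBehaviour
{
    public Camera offscreenCamera;    // orthographic camera looking at the sprites
    public Renderer screenQuad;       // quad placed in front of the main camera
    public Material metaballMaterial; // material using the custom shader

    void Start()
    {
        // The offscreen camera accumulates all blurred sprites into one texture.
        var rt = new RenderTexture(Screen.width, Screen.height, 0);
        offscreenCamera.targetTexture = rt;

        // The quad's shader then processes that combined image in one pass,
        // so the effect applies to all sprites together, not one by one.
        metaballMaterial.mainTexture = rt;
        screenQuad.material = metaballMaterial;
    }
}
```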

Second camera does not show UI Image element (but does show sprite)

I am working on a Unity 2D board game similar to WordFeud: a gameboard with tiles (UI Image prefabs generated with code), with underneath the board a user interface with draggable tiles, buttons, scores etc. I want to zoom in on the game board by double tap on the screen.
The game scales with the device resolution using the Canvas Scaler option as demonstrated here: Charger Games. For the zoom I have added a second camera that renders its output to a render texture, as explained here: Gamasutra. The render texture will eventually take the position of the game board. In the attached example screens it is a bit smaller than the full board, to show what's happening.
In the trial setup I am able to perform the zoom with the second camera on a sprite (dragged to the scene) but the second camera fails to render a UI Image element if added to the scene. Yet this is what my game is composed of: UI Image elements.
Question: how do I get the second camera to show UI Image elements on the render texture?
Attached an image with combined screen dumps of the setup. From left to right: 1) the main canvas with the scaler enabled (it uses the UI image - game board to fill the background), 2) second camera setup (note that camera preview shows the Wordfeud sprite, but not the UI Image test), 3) params of WordFeud sprite, 4) params of UI Image test, 5) params of RawImage - render texture (this is a child of Minimap canvas).
It does not matter whether I put the sprite or the UI Image in the Default layer or the UI layer; the result in both cases is the same: the sprite is rendered, the UI Image is not.
Unity version 5.3.3.f1

How can I make a second camera display only a specific GameObject in a small window in the bottom-right corner?

I added a new layer and called it CameraLayer.
I have one Camera, tagged MainCamera, as a child under the ThirdPersonController. In the Inspector I set this camera's Culling Mask to show everything except CameraLayer, so the Culling Mask now reads Mixed. This is the main camera showing everything.
I added the second camera as a child under a Cube, and I want this camera to show only the cube, all the time, in a small window in the bottom-right corner of the Game view while the game is running.
In this camera's Culling Mask I selected only CameraLayer and unselected the rest, so its Culling Mask reads CameraLayer.
But when I run the game I don't see the camera under the Cube at all.
In the screenshot I marked in black where the camera under the Cube should display the Cube:
1) Create a Canvas as a child of your first Camera, then create a RawImage as a child of this newly created Canvas.
This is the image that will display your second camera's output, so move it to the bottom right of your first camera's view.
2) Then create a RenderTexture and assign it to the Texture property of the RawImage you just created.
3) Finally, assign this RenderTexture to the Target Texture property of your second Camera.
This Target Texture will be used by Unity as a buffer texture for your camera. Your scene should now look like this:
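Anchoring the RawImage to the bottom-right corner can also be done from a script rather than by dragging it in the Scene view. A sketch (the window size and margin values are arbitrary; runs only inside Unity):

```csharp
using UnityEngine;
using UnityEngine.UI;

public class AnchorMinimap : MonoBehaviour
{
    public RawImage minimap; // the RawImage showing the second camera's texture

    void Start()
    {
        RectTransform rt = minimap.rectTransform;
        // Anchor and pivot at the bottom-right corner of the canvas.
        rt.anchorMin = new Vector2(1f, 0f);
        rt.anchorMax = new Vector2(1f, 0f);
        rt.pivot = new Vector2(1f, 0f);
        rt.sizeDelta = new Vector2(200f, 150f);       // window size in pixels
        rt.anchoredPosition = new Vector2(-10f, 10f); // small margin from the corner
    }
}
```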