Why doesn't the camera mesh appear in Unity? - unity3d

I have two cameras, and to each I've attached a Mesh Filter and Mesh Renderer (which are just red and green cubes). I notice that in the Scene view I can see one camera's position from the other camera. But when I actually play the scene, the camera I'm looking through cannot see the cube of the other camera. Why is that?
EDIT: Adding some clarity, I hope.
When I click the play symbol:
When I switch to the green camera, it cannot see the red camera.
When I switch to the red camera, it cannot see the green camera.
This is even though both cameras have a mesh attached to them, which appears in both the Scene and Game previews (screenshot above).

The camera at the position of the green cube cannot see the green cube because it is inside the cube, and the standard material uses backface culling, which means only the front sides of the mesh's faces are drawn, not their backs. The standard Unity cube has all its faces pointing outwards.
Also, I assume the cube would have been clipped regardless, since it is closer than the camera's near clipping distance. Anything outside the camera's view frustum is simply not drawn.
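If it helps to check the near-plane part of this, here is a minimal debug sketch (the component and field names are placeholders, not from the original scene) that logs when a cube sits closer to a camera than its near clipping plane:

```csharp
using UnityEngine;

// Debug helper: attach to a camera and assign the other camera's cube.
// It only illustrates the near-clip reasoning above.
public class NearClipCheck : MonoBehaviour
{
    public Transform otherCameraCube;   // e.g. the red or green cube

    void Update()
    {
        Camera cam = GetComponent<Camera>();
        float distance = Vector3.Distance(cam.transform.position, otherCameraCube.position);

        // Anything closer than the near clip plane (or otherwise outside the
        // view frustum) is not drawn at all.
        if (distance < cam.nearClipPlane)
        {
            Debug.Log(otherCameraCube.name + " is inside the near clip distance (" + cam.nearClipPlane + "), so it cannot be rendered.");
        }
    }
}
```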

Related

Objects disappear at certain angles

So I made a water shader, and for some reason in-game the object disappears at certain camera angles. The material is also flipping the plane 90 degrees when I place it on an object.
[Attachments: ShaderGraph (×2), FlippedPlaneProBuilder, VideoOfError]
I tried changing the alpha but that had no effect
I'd wager that your plane is moving into the near-clip plane of the camera. Anything that is not within the camera's viewable frustum is clipped away (not rendered).
read more here
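One quick way to test that hypothesis would be to temporarily pull the near clip plane in very close, either in the Camera inspector or from a small script like this sketch (assuming the camera in question is the one tagged MainCamera):

```csharp
using UnityEngine;

// Temporary test of the near-clip hypothesis: pull the near plane in very
// close. If the water plane stops disappearing, it was being clipped.
public class NearClipTest : MonoBehaviour
{
    void Start()
    {
        // Camera.main is the camera tagged "MainCamera" in the scene.
        Camera.main.nearClipPlane = 0.01f;
    }
}
```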

Rendering camera output to a Render Texture and displaying with a clear background

I have a HoloLens app made in Unity using URP. I added a second camera of render type Base and set its output texture to a custom render texture. I set the camera's background type to Uninitialized. I created a material and set the render texture as its surface input base map. I then created a plane, dragged the material onto it, and positioned it in the field of view of the main camera. I added a cube in the field of view of the second camera. I followed this link to do this... Rendering to a Render Texture
The result is I see the plane and the output of the second camera (cube) in the main camera, which is what I want. But I see the entire plane as well as a black background. Is there a way that I can make the plane appear transparent so only the cube is displayed?
[Screenshots: camera, render texture, material, plane with render texture, main camera with cube and black background]
Your plane's material is set to Surface Type -> Opaque.
Changing that to
Surface Type -> Transparent
Blending Mode -> Alpha
should solve your issue.
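Note that for the alpha blending to have anything to work with, the render texture needs an alpha channel and the second camera's background should end up with alpha 0. Here is a rough sketch of one way to wire that up from a script (all names are placeholders, and this clears to a transparent solid color rather than leaving the background uninitialized):

```csharp
using UnityEngine;

// Rough sketch of the second camera's render-texture wiring. It assumes the
// plane material has already been switched to Surface Type = Transparent /
// Blending Mode = Alpha as described above. All names are placeholders.
public class RenderTextureCameraSetup : MonoBehaviour
{
    public Camera secondCamera;      // the extra "Base" render-type camera
    public Renderer planeRenderer;   // the plane that displays the texture

    void Start()
    {
        // ARGB32 keeps an alpha channel for the transparent material to use.
        var rt = new RenderTexture(1024, 1024, 24, RenderTextureFormat.ARGB32);

        secondCamera.targetTexture = rt;
        secondCamera.clearFlags = CameraClearFlags.SolidColor;
        secondCamera.backgroundColor = Color.clear;   // clear to alpha = 0

        // _BaseMap is the URP Lit shader's base map property.
        planeRenderer.material.SetTexture("_BaseMap", rt);
    }
}
```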

Unity Scene and Game tabs are stretched and colliders are not in sync with the visual mesh

Here is a cube:
[Image: a cube with dimensions (1,1,1); however, to the eye the cube looks more like a cuboid]
What is even more annoying is that the box collider is not lining up with the mesh.
[Image: the same cube rotated a bit, with the box collider not lining up]
How do I fix this, guys?
The same stretching happens equally in the game and scene tabs.

Unity ARKit: project camera frame on detected plane as texture to render a top down view with second camera

I'm using Unity3D and ARKit, and my goal is to project a camera image onto a plane detected with ARKit, so that I can render the plane with a second orthographic camera in a bird's-eye view. The result would be a textured map seen from a top-down view. The whole thing doesn't have to run in real time in the first version; it only has to work for one frame on a button click.
My current steps are:
freeze the frame on button click (ReadPixels to a UI Image)
duplicate the ARKit plane mesh, so that the plane is no longer extended and tracked
Now comes the problem: how do I get the camera image (which is stored in my UI Image) correctly perspective-transformed onto my plane? Do I have to do the transformation on the texture or in the shader?
How do I handle the case where the plane is larger than the current camera image? Do I have to crop the plane first? As in the picture (case 2), only the green area can be textured.
From the ARKit plane geometry I can get the 3D vertices and the texture coordinates of the plane. I can also transform the world coordinates to screen coordinates, but I'm struggling with how and where to do the image transformation from screen space to my detected plane.
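For what it's worth, since the plane's world-space vertices can already be transformed into screen space, one way the last step could look is to write each vertex's viewport coordinates straight into the mesh UVs. This is only a rough sketch under a couple of assumptions: the names are placeholders, the frozen screen image is assigned as the duplicated plane's texture, and vertices that project outside the 0–1 viewport range are the part of the plane the image doesn't cover (case 2).

```csharp
using UnityEngine;

// Rough sketch of the projection step described above (names are placeholders).
// It assumes the frozen screen image has been assigned as the plane's texture
// and that the plane mesh is the duplicated, no-longer-tracked copy.
public class ProjectScreenOntoPlane : MonoBehaviour
{
    public Camera arCamera;        // the camera whose frozen frame is projected
    public MeshFilter planeMesh;   // the duplicated ARKit plane mesh

    public void ProjectCurrentView()
    {
        Mesh mesh = planeMesh.mesh;
        Vector3[] vertices = mesh.vertices;
        Vector2[] uvs = new Vector2[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            // Plane vertices are in local space; bring them into world space first.
            Vector3 world = planeMesh.transform.TransformPoint(vertices[i]);

            // Viewport coordinates run 0..1 across the screen, which is exactly
            // what a UV needs if the plane texture is the frozen screen image.
            Vector3 viewport = arCamera.WorldToViewportPoint(world);
            uvs[i] = new Vector2(viewport.x, viewport.y);

            // viewport.x/y outside 0..1 (or viewport.z < 0, i.e. behind the
            // camera) is the part of the plane the image does not cover
            // (the non-green area in "case 2").
        }

        mesh.uv = uvs;
    }
}
```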

Unity3D Gui Sphere having distorted edges

I'm trying to make a switchboard with circular buttons.
For it I'm using a Quad as a base and then overlaying 3D spheres for the buttons.
For the texture of the spheres, I have made a Material to which a rectangular image is assigned.
In the preview it looks good, with all smooth edges; however, in the Game preview, when I look closely at the spheres, their edges are distorted/pixelated.
Is there any restriction which should be adhered to so the spheres have perfectly smooth edges, or am I missing something?
Image uploaded. The Quad is white and there is a light source.
The texture was imported, with the platform override set to 2048.