RenderSettings customReflection texture has invalid type, 2D given while only CUBE is supported. Custom reflection texture will not be used - unity3d

I started a Unity project with a PS1-style graphics look, and for about two days now I've been getting this error. The game still works more or less fine, but can someone explain why this is happening? Thanks!
(I'm not using URP.)
Full error :
RenderSettings customReflection texture has invalid type, 2D given while only CUBE is supported. Custom reflection texture will not be used in UnityEditor.EditorApplication:Internal_RestoreLastOpenedScenes ()

In the Texture 2D Import Settings, change Texture Shape from 2D to CUBE.
In your project, click on your texture. Import settings are found in the inspector.
Cube defines the Texture as a cubemap. You could use this for Skyboxes or Reflection Probes, for example. This type is only available with the Default, Normal Map, and Single Channel Texture types.
From the Texture Shape Reference of Texture Importer Documentation.
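If you prefer to set the reflection from a script rather than only fixing the import setting, a minimal sketch like the following checks the texture shape first. The field name is a placeholder, and the exact RenderSettings property can vary by Unity version, so treat this as a starting point rather than the definitive API:

using UnityEngine;
using UnityEngine.Rendering;

public class CustomReflectionSetter : MonoBehaviour
{
    // Placeholder field: assign the reflection texture in the Inspector.
    [SerializeField] private Texture reflectionTexture;

    void Start()
    {
        // RenderSettings only accepts a cubemap here; a 2D texture triggers
        // the "2D given while only CUBE is supported" warning.
        if (reflectionTexture != null && reflectionTexture.dimension == TextureDimension.Cube)
        {
            RenderSettings.customReflection = (Cubemap)reflectionTexture;
            RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        }
        else
        {
            Debug.LogWarning("Reflection texture must be imported with Texture Shape = Cube.");
        }
    }
}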

Related

Do you know how to do an inverted mask/shader, or any other way to see only the part of an object within a sphere boundary, in Unity?

Cut out sphere
What I'm looking for is exactly this behaviour, but I need it to be compatible with URP. As you may see, I'm just starting with shaders.
Could you give me any guidance on how to update this to URP?
I have looked into stencil shaders / cut-outs / buffers.
I have replicated the portal-style ones, but I need the object to be able to grow like a tree, and anything outside the sphere should not show.
You can do it with the Sphere Mask node:
Link Out to the Alpha property (if your Surface Type is set to Opaque, enable Alpha Clipping in your Graph settings).
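If the sphere needs to move or grow at runtime, you can expose the Sphere Mask node's Center and Radius as properties in the graph and drive them from a script. A rough sketch, where _SphereCenter and _SphereRadius are made-up reference names that must match whatever you called the exposed properties:

using UnityEngine;

public class SphereMaskDriver : MonoBehaviour
{
    [SerializeField] private Renderer targetRenderer;
    [SerializeField] private Transform sphereCenter;
    [SerializeField] private float sphereRadius = 1f;

    void Update()
    {
        // Push the sphere's position and size into the Shader Graph properties.
        var mat = targetRenderer.material;
        mat.SetVector("_SphereCenter", sphereCenter.position);
        mat.SetFloat("_SphereRadius", sphereRadius);
    }
}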

Unity Point-cloud to mesh with texture/color

I have a point cloud and an RGB texture from a depth camera that fit together. I procedurally created a mesh from a selected part of the point cloud by implementing the quickhull 3D algorithm for mesh creation.
Now, somehow I need to apply the texture that I have to that mesh. Note that there can be multiple selected parts of the point-cloud thus making multiple objects that need the texture. The texture is just a basic 720p file that should be applied to the mesh material.
Basically I have to do this: https://www.andreasjakl.com/capturing-3d-point-cloud-intel-realsense-converting-mesh-meshlab/ but inside Unity. (I'm also using a RealSense camera)
I tried with a decal shader but the result is not precise. The UV map is completely twisted from the creation process, and I'm not sure how to generate a correct one.
UV and the mesh
I only have two ideas, but I don't really know if they'll work or how to do them:
1. Try to create a correct UV map and then wrap the texture around it somehow.
2. Somehow bake the colors into vertex colors and then use the vertex colors to create the desired effect.
What other things could I try?
I'm working on quite a similar problem, but in my case I just want to create a complete mesh from the point cloud, not just a quickhull, because I don't want to lose any depth information.
I'm nearly done with the mesh algorithm (just need to do some optimizations). The challenging part now is matching the RGB camera's texture to the depth sensor's point cloud, because the two cameras of course have different viewpoints.
Intel RealSense provides an interesting whitepaper about this problem, and as far as I know the SDK corrects for the different perspectives with UV mapping and provides a red/green UV map stream for your shader.
Maybe the short report can help you out. Here's the link. I'm also very interested in what you are doing. Please keep us up to date.
Regards
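As a rough starting point for the second idea from the question (baking colours into vertex colours), here is a minimal sketch. It assumes you can already look up an approximate colour for each point, for example by projecting the point into the RGB image, which is exactly the correspondence problem the RealSense UV map is meant to solve:

using UnityEngine;

public static class VertexColorBaker
{
    // Writes one colour per vertex; you then need a shader that actually reads
    // vertex colours (e.g. a Shader Graph with a Vertex Color node) to see them.
    public static void Bake(Mesh mesh, System.Func<Vector3, Color> sampleColorAt)
    {
        Vector3[] vertices = mesh.vertices;
        Color[] colors = new Color[vertices.Length];

        for (int i = 0; i < vertices.Length; i++)
        {
            // sampleColorAt is a placeholder: map the vertex position back into
            // the RGB image and return the pixel colour found there.
            colors[i] = sampleColorAt(vertices[i]);
        }

        mesh.colors = colors;
    }
}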

How to import a colored mesh from MeshLab into Unity (maybe through 3ds Max)

I'm very new to this. I have a PLY file with some faces and a colored point cloud. My final aim is to import this mesh into Unity with its color.
I found some approaches, and they seem to need 3ds Max, but I can't even get the mesh to show its color in 3ds Max.
I tried:
1. In MeshLab, Filters -> Texture -> Per Vertex Texture Function, then Convert PerVertex UV into PerWedge UV. I get an OBJ with an MTL (only 1 KB). Of course, it doesn't work.
2. Filters -> Texture -> Transfer Vertex Color to Texture. I get a PNG, but it seems to hold only the color of one side of the mesh.
3. Filters -> Color Creation and Processing -> Transfer Color: Vertex to Face. I get an OBJ with a rather large MTL file, which makes the OBJ very slow to open in MeshLab, 3ds Max and Unity, and it doesn't work either.
I think the color system of 3ds Max and Unity is different from MeshLab's, but I have no idea what to do. Any suggestion is welcome. Thanks!
It seems that after Filters -> Texture -> Transfer Vertex Color to Texture and saving, the mesh does get a texture, but it comes out wrong, like this: pictures here
error 1
I think something is going wrong when generating the texture. Any suggestion?
I found the answer on YouTube:
https://www.youtube.com/watch?v=6wP_e37t7PI
1. Filters -> Texture -> Parametrization: Trivial Per-Triangle
2. Save the project
3. Filters -> Texture -> Transfer Vertex Attributes to Texture
4. Export the mesh as an .obj file
I don't know why, but it really works.
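If you want to check inside Unity whether the imported mesh actually carries UVs or per-vertex colours (handy for telling which of the MeshLab exports worked), a small hypothetical helper like this can log it:

using UnityEngine;

// Debug helper: attach to the imported model to see whether vertex colours
// and UVs survived the import.
public class MeshColorCheck : MonoBehaviour
{
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().sharedMesh;
        Debug.Log(mesh.name + ": " + mesh.vertexCount + " vertices, "
                  + mesh.colors.Length + " vertex colors, "
                  + mesh.uv.Length + " UVs");
    }
}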

Flatten 3D object to create a template for a 2D texture map

I would like to create a texture map for a 3D car model I have. I am not sure where to start. I thought maybe I could unwrap the 3D object to a 2D image and then use this as an outline to draw my texture. Is this possible, or is there a simpler solution?
Thank you in advance!
I would like to create a texture map for a 3D car model I have. I am not sure where to start
What you are asking about is called UV mapping.
"UV mapping is the 3D modeling process of projecting a 2D image to a 3D model's surface for texture mapping."
Source: https://en.wikipedia.org/wiki/UV_mapping
UV mapping is normally done when creating the model in 3d modelling software, although there may be assets in Unity able to do the same. To my knowledge Unity is not able to directly UV map.
You can however, change the texture of an object inside Unity as well as assign objects various colours and materials.
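For example, swapping a texture and tint from a script is straightforward; in this sketch the texture field is a placeholder you would assign yourself:

using UnityEngine;

public class ApplyCarTexture : MonoBehaviour
{
    // Placeholder: assign the finished car texture in the Inspector.
    [SerializeField] private Texture2D carTexture;

    void Start()
    {
        var material = GetComponent<Renderer>().material;
        material.mainTexture = carTexture;   // swap the texture
        material.color = Color.white;        // tint the material
    }
}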
maybe I could unwrap the 3D object to a 2D image and then use this as an outline to draw my texture
To my knowledge you need 3D modelling software to do so, but yes, it is possible.
You can try to change it through scripting, but I'd recommend looking into 3D modelling software instead, as I believe that even if it is possible it will be bothersome.
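For completeness: Unity scripts can read and write a mesh's UVs, so a very crude unwrap is possible from code, which also shows why it gets bothersome. A minimal sketch that just projects onto the XZ plane (only sensible for nearly flat geometry):

using UnityEngine;

public class PlanarUnwrap : MonoBehaviour
{
    // Crude illustration only: real unwrapping (seams, islands, packing)
    // is what dedicated modelling tools do for you.
    void Start()
    {
        Mesh mesh = GetComponent<MeshFilter>().mesh;
        Vector3[] vertices = mesh.vertices;
        Vector2[] uv = new Vector2[vertices.Length];
        Bounds bounds = mesh.bounds;

        for (int i = 0; i < vertices.Length; i++)
        {
            // Normalise the local X/Z position into the 0..1 UV range.
            uv[i] = new Vector2(
                Mathf.InverseLerp(bounds.min.x, bounds.max.x, vertices[i].x),
                Mathf.InverseLerp(bounds.min.z, bounds.max.z, vertices[i].z));
        }

        mesh.uv = uv;
    }
}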
3D modelling software I know of:
Blender - Free
Maya - Licensed
3DS Max - Licensed

How to do texture mapping in OpenGL ES? (Mapping a 2D face onto a 3D mesh)

I need to convert a 2D face image to a 3D image. For this I thought of using texture mapping with OpenGL ES. I did a lot of googling to find some samples but couldn't find any. Can someone guide me on how to do this?
Input: 2D image
Output: 3D image
Platform: iOS
As you know, OpenGL works with 3D or 2D vertices that each have a few attributes, such as position, normal, color and texture coordinate. You have to set these values first, and then you can render.
In ES 2.0 you explicitly have to pass these values to the vertex shader,
then hand two of them, the texture coordinate and the normal, on to the fragment shader,
and in the fragment shader you can combine them with a texture sampler to render your face object.
If you work on iOS, the resources below should be very helpful.
Explanation:
http://ofps.oreilly.com/titles/9780596804824/chtextures.html
Source Code:
http://www.developers-life.com/iphone-3d-samples.html