Sphere 360-texture low quality - unity3d

I'm creating an app to show 360 images with Cardboard.
I created a scene in Unity using the Cardboard camera and a sphere, and mapped the 360 image onto the sphere as a texture.
When viewing it, the texture is low quality and shows jagged edges, so the details come out poorly.
Any ideas for solving this texture problem? I tried a script that creates a different kind of sphere, but it didn't solve the problem.

You need to use an icosphere for this to work. You'll still get some distortion near the poles, but it's far better than the UV sphere that Unity provides.
The second thing is that you'll need a high-detail icosphere, as you'll need more vertices.
The third thing is the texture's quality and size. I think the default FOV in Unity is around 60, but you'll be mapping the texture across a 360-degree view, so you'll need a much larger texture than the on-screen textures you are used to.
You can look over this article if you want more details about the differences between icospheres and UV spheres, or just go to the bottom of the article and download the Unity project. The project includes ready-made icospheres and you can experiment with them to find out which one is best suited for your project. I'm using the Octahedron Sphere 4 R1. Any fewer polys and there is too much distortion; any more and the FPS drops too much.
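One more thing worth checking alongside the mesh: Unity's texture importer clamps textures to a Max Size of 2048 by default, which will downscale (and blur) a large equirectangular image before it ever reaches the sphere. Below is a minimal sketch of an editor script that raises those import settings; the folder name "Assets/Textures360" and the specific values are just placeholders for illustration, so adjust them to your project.

```csharp
using UnityEditor;
using UnityEngine;

// Hypothetical editor script: place it in an "Editor" folder.
// Raises the import settings for 360/equirectangular textures so Unity
// does not silently downscale or compress them on import.
public class Pano360Importer : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        // Only touch textures under this (assumed) folder.
        if (!assetPath.StartsWith("Assets/Textures360"))
            return;

        var importer = (TextureImporter)assetImporter;
        importer.maxTextureSize = 8192;                                  // default is 2048, which blurs large panoramas
        importer.textureCompression = TextureImporterCompression.Uncompressed;
        importer.mipmapEnabled = true;                                   // reduces shimmering/jagged edges at a distance
        importer.filterMode = FilterMode.Trilinear;
        importer.anisoLevel = 9;                                         // helps near the poles where texels get stretched
    }
}
```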

Related

Unity 3D 2D Imported Sprites Pixelated

I have this image generated thanks to PowerPoint:
We can see here, the image is not pixelated.
But when I import this in Unity 3D, the result is:
Here you can see the sprite's parameters:
Am I using the correct tool for my sprite creation? (PowerPoint)
If the answer is "No", which tool can I use to avoid this kind of problem?
If the answer is "Yes", how can I avoid this pixelation of my sprite in Unity 3D?
Thanks a lot for your help!
PowerPoint is not the best image editor :D But your sprite looks correct; it's possible you just set a small scale for it in the Unity Scene window.
Try increasing your sprite's Scale in the Scene window. Select it, then in the Inspector increase the X, Y, Z Scale parameters in the Transform component (it should be at the top).
I just tried your image in my Unity editor and it seems fine. Make sure your Sprite Renderer's transform scale is set to (1,1,1). For me even that doesn't seem to affect the quality, but it is best practice not to have distorted, inconsistent scales across your scene.
One tip for improving your sprite quality is to export it at a power-of-two (POT) resolution, e.g. 512x512, 1024x1024 or 2048x2048. Unity can compress POT textures with much higher precision and quality; for non-POT sprites, the dimensions should at least be a multiple of 4 for most compression formats.
Make sure your build target is set to Standalone and not another platform. If you are set to Android, for example, check the Android-specific compression override in your sprite's import inspector. That can also affect the quality.
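If that Android override turns out to be the culprit, it can also be forced from an editor script. A rough sketch follows; the menu item name is made up, and RGBA32 is simply an uncompressed format chosen for illustration.

```csharp
using UnityEditor;

// Hypothetical editor utility: select a sprite in the Project window,
// then run the menu item to disable lossy compression for Android builds.
public static class SpriteCompressionCheck
{
    [MenuItem("Tools/Disable Android Compression For Selected Sprite")]
    static void DisableAndroidCompression()
    {
        string path = AssetDatabase.GetAssetPath(Selection.activeObject);
        var importer = AssetImporter.GetAtPath(path) as TextureImporter;
        if (importer == null) return;

        var android = importer.GetPlatformTextureSettings("Android");
        android.overridden = true;
        android.format = TextureImporterFormat.RGBA32;   // uncompressed, so quality loss comes only from resolution
        android.maxTextureSize = 2048;
        importer.SetPlatformTextureSettings(android);
        importer.SaveAndReimport();
    }
}
```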
To answer your question on what image editor to use, the best one, in my opinion, is Adobe Photoshop. If you don't want to pay for it, just search for any free image editing tool. But stop using PowerPoint; I'm not sure how you ended up with that.

How can I use baked lighting on sprites? / How to light up a large area in 2D?

I'm having trouble figuring out how to light up large areas of sprites in Unity 2D. My previous knowledge of Unity's lighting is zero.
I first tried using a large number of point lights with the "Sprites/Diffuse" material, but only about five would actually render at a time, so I guess there's a limit on that.
Then I tried putting in an area light. That didn't do anything, so that's when I started doing research about baked lighting on sprites (and baked lighting in general). I found stuff like this, but I couldn't get it to work, either because it's outdated or because I don't know what I'm doing. Other answers I've come across seem to assume that the reader already knows something about lighting in Unity, which, to be honest, I don't. Unity's documentation website had some information on it, but no tutorials that go into how to set up baked lighting.
I've tried a bunch of different combinations of materials (like using the "Standard" shader for the sprites instead of "Sprites/Diffuse", emission, etc.) and I enabled "Baked Global Illumination" under Lighting > Settings.
If baked lighting isn't possible on sprites (or isn't worth the trouble), what are the alternatives?
Edit: I made sure not to have the lights pointing in the wrong direction, and I do realise that Unity 2D is just like painting onto a piece of paper in Unity 3D. I was able to get point lights to work, but only a few at a time. I don't need to light the entire screen at once; I need to light a large, specific area at once.
Some tips...
Working with sprites, you're in 2D. When you add a light, switch to 3D mode and rotate the view to make sure your light is pointed at your objects and is not on the same plane as (or level with) them, as that will cast all the light behind them.
If you're trying to light up everything on the screen (in camera), attach an area light to the camera at the camera's position, point it where the camera points, and then in the Inspector you can change its variables: intensity, range, width, height, etc.
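On the "only about five point lights render" limit from the question: with the built-in forward renderer, that cap is normally the Pixel Light Count quality setting (Edit > Project Settings > Quality), not a hard limit on sprites. Here is a small sketch, assuming the sprites use the Sprites/Diffuse material; all the values are just examples.

```csharp
using UnityEngine;

// Raises the per-pixel light limit and adds a point light near a sprite.
// Attach to any GameObject in the scene and assign the sprite field.
public class SpriteLightingSetup : MonoBehaviour
{
    public SpriteRenderer sprite;   // a sprite using the Sprites/Diffuse material

    void Start()
    {
        // Forward rendering only draws this many lights per object as full
        // per-pixel lights; the rest fall back to cheaper per-vertex lighting,
        // which is why only a handful of point lights appeared to work.
        QualitySettings.pixelLightCount = 16;

        var lightGO = new GameObject("Point Light");
        var pointLight = lightGO.AddComponent<Light>();
        pointLight.type = LightType.Point;
        pointLight.range = 10f;
        pointLight.intensity = 1.5f;

        // Keep the light in front of the sprite (towards the camera in 2D),
        // otherwise it lights the back face and nothing shows.
        lightGO.transform.position = sprite.transform.position + new Vector3(0f, 0f, -3f);
    }
}
```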
Emissive Texture:
https://www.youtube.com/watch?v=oa6kW5HhRd4
For some reason, I never even thought about going into the asset store. I found this for free, and it looks like it will work: Light2D.

Unity texture blurred despite everything being highres and uncompressed

I have a model in Blender with perfectly high-res textures. I export it as FBX, import it into Unity, and import the textures into Unity at full resolution (no mipmapping, point filtering, no compression, etc.), and still the texture looks extremely blurred. Any hints as to why that might be?
Here's a comparison of the face in Blender vs Unity.
Here are the relevant, currently applied settings. I've tried many, but imho those (extreme) settings should make 100% sure that the texture is at maximum quality.
EDIT: Interestingly, a close-up image reveals seemingly high detail on the clothes below the face, yet not on the face itself. Maybe that's just my imagination. Have a closer look here.
Thanks for some help!
It's really hard to tell in the comparison shot you linked (as that itself is so low resolution), but the texture resolution looks the same. It doesn't look any more blurry in the Unity image.
To me it looks more like a shader + lighting issue. Are you using the Standard shader? Do you have additional maps that are not applied in the Unity material (like a normal map)?
Try lighting him with some point lights instead of just a hard directional light.
On a side note, 8192 is an absurdly high resolution for that guy. With how blurry the texture looks (in both Blender and Unity) you should be able to get the same detail with 512, at most 1024.
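If missing maps turn out to be the issue, the imported material's textures can also be assigned explicitly. Below is a hedged sketch using the built-in Standard shader's property names; the texture fields are placeholders you would fill in the Inspector, and none of this is specific to the asker's project.

```csharp
using UnityEngine;

// Assigns the albedo and (optional) normal map to a Standard-shader material,
// in case the FBX import created a material without them.
public class AssignStandardMaps : MonoBehaviour
{
    public Texture2D albedo;     // e.g. the face texture exported from Blender
    public Texture2D normalMap;  // optional; leave empty if you have none

    void Start()
    {
        var mat = GetComponent<Renderer>().material;   // material instance, so only this object changes
        mat.SetTexture("_MainTex", albedo);

        if (normalMap != null)
        {
            mat.SetTexture("_BumpMap", normalMap);
            mat.EnableKeyword("_NORMALMAP");           // Standard shader needs the keyword, not just the texture
        }
    }
}
```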

Applying textures to a plane game object results in spreading

As visible in the attached image, when I apply a grass or ground texture to a plane, it all spreads around and looks like it is being stretched to fill the whole floor...
I cannot seem to find any settings that would address this. I have already tried setting the wrap mode to repeat, according to this manual page...
Could someone please help me understand why this is caused and how I can fix it?
In the material's shader options you can change the tiling in the X and Y directions. Greater tiling means more repeats of the texture across the plane. Tiling defaults to 1 when you drag a texture on, which explains the stretched-out look you got. For a fairly large plane, try 100 by 100 or whatever fits.
To open the shader settings for the material you are using, select Forest Floor inside the inspector panel.
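The same tiling can also be set from code if you prefer. A minimal sketch follows, assuming the plane's material uses the default main texture slot; the 100x100 value is just an example for a large plane.

```csharp
using UnityEngine;

// Repeats the ground texture across the plane instead of stretching it once.
public class GroundTiling : MonoBehaviour
{
    public Vector2 tiling = new Vector2(100f, 100f);   // repeats in X and Y; tune to the plane size

    void Start()
    {
        var rend = GetComponent<Renderer>();
        // sharedMaterial changes the material asset itself (same as editing it in the Inspector);
        // use .material instead if you only want to affect this one plane instance.
        rend.sharedMaterial.mainTextureScale = tiling;
    }
}
```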
There is a tool on the asset store, the "Auto Texture Tiling Tool", which automates the process and has a ton of features for applying and tiling textures. It is way more useful than the Tile settings on materials. Since it is independent from shaders and materials, you can use the same materials with different tiling settings. Check it out here: https://assetstore.unity.com/packages/tools/utilities/auto-texture-tiling-tool-41613

Quartz 2D / OpenGL / cocos2d image distortion on iPhone by moving vertices for a 2.5D iPhone game

We are trying to achieve the following in an iPhone game:
Using 2D PNG files, set up a scene that seems 3D. As the user moves the device, the individual PNG files would warp/distort accordingly to give the effect of depth.
Example of a scene: an empty room, 5 walls and a chair in the middle = 6 PNG files layered.
We have successfully accomplished this using native functions like skew and scale. By applying transformations to the various walls and the chair, as the device is tilted or moved, the walls skew/scale/translate. However, the problem is that since we are using 6 PNG files, the edges don't meet as we move the device. We need a new solution using a real engine.
Question:
Instead of applying skew/scale transformations, we are thinking that if we had the freedom to move the vertices of the rectangular images, we could precisely distort the images and keep all the edges 100% aligned.
What is the best framework to do this in the LEAST amount of time? Are we going about this the correct way?
You should be able to achieve this effect (at least in regards to the perspective being applied to the walls) using Core Animation layers and appropriate 3-D transforms.
A good example of constructing a scene like this can be found in the example John Blackburn provides here. He shows how to set up layers to represent the walls in a maze by applying the appropriate rotation and translation to them, then gives the scene perspective by using the trick of altering the m34 component of the CATransform3D for the scene.
I'm not sure how well your flat chair would look using something like this, but certainly you can get your walls to have a nice perspective to them. Using layers and Core Animation would let you pull off what you want using far less code than implementing this using OpenGL ES.
Altering the camera angle is as simple as rotating the scene in response to shifts in the orientation of the device.
If you're going to the effort of warping textures as they would be warped in a 3D scene, then why not let the graphics hardware do the hard work for you by mapping the textures to 3D polygons, then changing your projection or moving polygons around?
I doubt you could do it faster by restricting yourself to 2D transformations; the hardware is geared up to do 3x3 (well, 4x4 homogeneous) matrix multiplication.