Images imported into Unity get jagged edges.
Jagged image
But the sprites from Unity's 2D pack do not get jagged:
Jagged image vs. 2D pack sprite
I have already tried all of the following:
1 - Disabling and enabling mip maps
2 - Matching the texture size to its usage
3 - Changing the filter mode
4 - Setting Compression to None
5 - Importing the images from Photoshop instead of Illustrator
Details:
Unity Version 5.5.0f
Images Created in Adobe Illustrator CC 2017
Try the following:
Go into the Import Settings by clicking on the file in the Project window.
Increase Pixels Per Unit to 200-300.
For all build platforms:
Increase Max Size to 8192 (or the highest available).
If you get the option, increase Compressor Quality to 100.
Not recommended: in the scene, reduce the scale and increase the width/height.
If none of that works:
Use a higher-resolution image, or increase the import resolution of the vector image.
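If you prefer to apply these import settings from a script instead of clicking through the Inspector, a rough editor sketch (the asset path below is hypothetical) could look like this:

    using UnityEditor;
    using UnityEngine;

    // Editor-only sketch: applies the import settings suggested above to one texture.
    public static class SpriteImportFixer
    {
        [MenuItem("Tools/Fix Sprite Import Settings")]
        private static void Fix()
        {
            const string path = "Assets/Sprites/MySprite.png"; // hypothetical asset path
            var importer = AssetImporter.GetAtPath(path) as TextureImporter;
            if (importer == null) { Debug.LogWarning("No texture importer at " + path); return; }

            importer.spritePixelsPerUnit = 200f;  // pixels per unit in the 200-300 range
            importer.maxTextureSize = 8192;       // Max Size
            importer.textureCompression = TextureImporterCompression.Uncompressed; // compression off
            importer.SaveAndReimport();
        }
    }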
Hope this helps!
I solved it.
You should create and import your textures and sprites at a larger size than you actually need. For example:
If you need a 64 x 64 texture, create and import it at 128 x 128.
By doing this the texture becomes anti-aliased and slightly softened.
Thanks.
I'm having problems setting the right resolution in Unity to avoid pixel distortion on my pixel-art assets. When I create a tile grid, the assets look terrible in the preview tab.
I have a tilemap with a 64x32 resolution for each tile.
I'm using 64 pixels per unit.
The camera size is set to 5 at a 640x360 resolution (using the following formula: vertical resolution / PPU / 2).
What am I doing wrong and what am I missing?
I don't know how the tiles are defined, but assuming they are rects with textures on top, you could check your texture filter settings and play with them a little, for example setting the filtering to "anisotropic".
To solve this problem and get a "pixel perfect" view, you need to apply the following formula:
Camera size = height of the screen resolution / PPU (pixels per unit) / 2
This will do the job!
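As a minimal sketch, assuming an orthographic camera and a pixelsPerUnit value that matches your sprite import settings, the same formula can be applied from a script:

    using UnityEngine;

    // Sketch: sets the orthographic size so one texture pixel maps to one screen pixel.
    public class PixelPerfectCameraSize : MonoBehaviour
    {
        [SerializeField] private float pixelsPerUnit = 64f; // must match the sprite import setting

        void Start()
        {
            var cam = GetComponent<Camera>();
            cam.orthographic = true;
            // Camera size = height of the screen resolution / PPU / 2
            cam.orthographicSize = Screen.height / pixelsPerUnit / 2f;
        }
    }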
I'm trying to figure out why my object's textures keep turning white once I scale the object down to 1% (or less) of its normal size.
I can manipulate the objects in real time with my fingers, and there is a threshold where all the textures (except a few) turn completely ghost white, as shown below:
https://imgur.com/wMykeFw
Any input on how to fix this is appreciated!
One potential cause of this issue is that certain shaders can miscalculate how to render textures when scales are set to very low values.
To render the asset this small with the same shader, re-import the mesh with a smaller scale factor (in the mesh import settings); that may fix it.
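If you want to do that re-import from a script rather than in the Inspector, here is a rough editor sketch (the asset path is hypothetical):

    using UnityEditor;
    using UnityEngine;

    // Editor-only sketch: lowers a mesh's import scale factor so the object can be
    // used at a small size without shrinking its transform to 1% in the scene.
    public static class MeshScaleReimporter
    {
        [MenuItem("Tools/Reimport Mesh At Smaller Scale")]
        private static void Reimport()
        {
            const string path = "Assets/Models/MyObject.fbx"; // hypothetical asset path
            var importer = AssetImporter.GetAtPath(path) as ModelImporter;
            if (importer == null) { Debug.LogWarning("No model importer at " + path); return; }

            importer.globalScale = 0.01f; // bake the 1% scale into the import settings
            importer.SaveAndReimport();
        }
    }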
Select ARCamera, then Camera, and in the Inspector increase the camera's clipping plane (you want to find the minimum clipping value that still works, to save on memory, so start at 20000 and work your way backwards until it stops working, then back up a notch).
Next (still in the camera's Inspector), set Rendering Path to Legacy Vertex Lit.
This should clear it up for you.
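A small sketch of setting those same two values from a script, assuming the Camera component sits on the same GameObject:

    using UnityEngine;

    // Sketch: applies the suggested far clipping plane and rendering path at runtime.
    public class CameraRenderFix : MonoBehaviour
    {
        void Awake()
        {
            var cam = GetComponent<Camera>();
            cam.farClipPlane = 20000f;                   // start high, then reduce until artifacts return
            cam.renderingPath = RenderingPath.VertexLit; // "Legacy Vertex Lit" in the Inspector
        }
    }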
I have been adding textures to SKSpriteNode() and also getting the texture from nodes in order to change them.
When adding textures, I can't add a texture over 4000 pixels wide or high without it resulting in a black SKSpriteNode() (the texture exists, it's just black).
When getting a texture from a node, I have to make sure the result is within 4000 pixels in width or height by scaling the node before getting the texture, otherwise it is blank again.
This is all fine for my game at the moment, but I am wondering if there is a built-in limit of 4000, just so I can allow for it.
(There is a reason why I am using such large textures... so it is possible that I might go over 4000 in width occasionally.)
Check out this helpful chart from Apple:
https://developer.apple.com/metal/limits/
It has a lot of information about graphical limitations. If you want to know the maximum texture size for iOS, find the entry for "Maximum 2D texture width and height".
It depends on which operating systems you are targeting. For example, if you want to support iOS 8 and higher, you are restricted to the iOS 8 limit of 4096 x 4096 pixels for 2D textures, even though later versions of iOS support larger textures.
I use the raspicam library from here. I can change the frame rate in the src/private/private_impl.cpp file. After setting the frame rate to 60, I do receive frames at 60fps, but the object size in the image changes. I attached two images: one captured at 30fps and another captured at 60fps.
Why do I get a bigger object size at 60fps, and how can I get the normal object size (the same as at 30fps)?
The first image is at 30fps and the second image is at 60fps.
According to the description here, the higher frame rate modes require cropping on the sensor for the 8-megapixel camera. At the default 30fps, the GPU code will have chosen the 1640x922 mode, which gives the full field of view (FOV). Exceed 40fps and it will switch to the cropped 1280x720 mode. In either case the GPU will then resize the image to the size you requested. Resize a smaller FOV to the same output size and any object in the scene will use more pixels. You can use the 5-megapixel camera if no cropping is required.
In other words, this should be described in terms of field of view, zoom, or cropping rather than the object actually getting bigger.
It is also possible to keep the image the same size at higher frame rates by explicitly choosing a camera mode that does "binning" (which combines multiple sensor pixels into one image pixel) for both lower- and higher-rate capture. Binning is helpful because it effectively increases the sensitivity of your camera.
See https://www.raspberrypi.org/blog/new-camera-mode-released/ for details when the "new" higher frame rates were announced.
Also, the page linked in the other answer has a nice picture with the various frame sizes and a good description of the available camera modes. In particular, modes 4 and higher use binning, starting with 2x2 binning (so 4 sensor pixels contribute to 1 image pixel) and ending with 4x4 (so 16 sensor pixels contribute to 1 image pixel).
Use the sensor_mode parameter to the PiCamera constructor to choose a mode.
I am a beginner in Unity and I can't understand how the scales of GameObjects work; in the following picture I have a quad and a sprite. The quad's scale should be smaller than the sprite's scale, but in the editor, when I select them, they both show a scale of 1.
Short Answer
Every object has a baseline (1, 1, 1) scale when first created. The sizes of a 2D sprite and a 3D quad are calculated differently, which is why they are not the same size in your screenshot.
Long Answer
The sprite is a 2D object and its size is based on the import settings of the sprite. The default setting for sprites is 100 pixels per unit (100 pixels equal 1 unit).
The quad is a 3D object and its size is based on what 1 unit is in Unity. In Unity's settings, 1 unit is 1 meter by default, so the quad is 1 x 1 meter by default.
If you import a sprite that is 100x100 and keep the default of 100 pixels per unit, it should appear the same size (with an orthographic camera) as the default quad.
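To see this in numbers, here is a small sketch (the serialized references are assumptions, assigned in the Inspector) that logs the world-space size of a sprite and of a quad, both at a scale of (1, 1, 1):

    using UnityEngine;

    // Sketch: compares the world-space size of a sprite and a quad at scale (1, 1, 1).
    public class SizeComparison : MonoBehaviour
    {
        [SerializeField] private SpriteRenderer spriteRenderer; // hypothetical references,
        [SerializeField] private MeshRenderer quadRenderer;     // assigned in the Inspector

        void Start()
        {
            // A 100x100 sprite at 100 pixels per unit reports a size of about 1 x 1.
            Debug.Log("Sprite size: " + spriteRenderer.bounds.size);

            // Unity's built-in Quad mesh is 1 x 1 unit (1 meter) regardless of the texture on it.
            Debug.Log("Quad size: " + quadRenderer.bounds.size);
        }
    }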