Resolution for 2d pixel art games - unity3d

I'm having problems setting the right resolution in Unity so that my pixel art assets don't get distorted. When I create a tile grid, the assets look terrible in the preview tab.
I have a tilemap with a 64x32 resolution for each tile.
I'm using 64 pixels per unit.
The camera size is set to 5 in a 640x360 resolution (using the following formula: vertical resolution / PPU / 2).
What am I doing wrong, and what am I missing?

I don't know how the tiles are defined, but assuming they are rects with textures on top, you could check your texture filter setting and play with it a little, setting it, for example, to "anisotropic".

To solve this problem and get a "pixel perfect" view, you need to apply the following formula:
Camera size = height of the screen resolution / PPU (pixels per unit) / 2
This will do the job!
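As a quick sanity check of the numbers from the question, here is a minimal sketch in Python (just the arithmetic):

    # Orthographic camera size for a pixel-perfect 2D setup:
    # size = vertical resolution / pixels per unit / 2
    def pixel_perfect_camera_size(vertical_resolution, pixels_per_unit):
        return vertical_resolution / pixels_per_unit / 2.0

    # Values from the question: a 640x360 target resolution at 64 PPU.
    print(pixel_perfect_camera_size(360, 64))  # 2.8125

Note that with a 640x360 target and 64 PPU the formula gives 2.8125, not 5; a camera size of 5 is what you get if you plug in the horizontal 640 instead of the vertical 360.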

Related

How to resize a texture for meters in ARKit

I need to tile a texture across a plane with updating geometry (a floor fill), and I need the texture to be scaled to real-world dimensions in centimeters. It is a square floor tile of 50 cm, and the texture size is 1024 pixels. How do I convert pixels to meters in ARKit? I know that I have to use SCNMatrix4MakeScale on the SCNMaterial's diffuse.contentsTransform, but I'm not sure what properties to set to get it accurate.
What you might do is use the physical size of the SCNNode you are working with and determine how many 50x50 cm squares it could fit. After you get this coefficient, use it inside the contentsTransform to achieve the needed behavior. Please refer to this answer for code snippets and more hints that you might find useful.
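As an illustration of that idea, here is a minimal sketch (Python, arithmetic only; the plane dimensions below are hypothetical) of how the repeat coefficients for the contentsTransform scale could be derived:

    # How many 0.5 m (50 cm) tiles fit across the plane in each direction.
    # These counts are what you would feed into SCNMatrix4MakeScale(sx, sy, 1)
    # for diffuse.contentsTransform, with the material's wrap mode set to repeat.
    TILE_SIZE_M = 0.5

    def repeat_counts(plane_width_m, plane_length_m, tile_size_m=TILE_SIZE_M):
        return plane_width_m / tile_size_m, plane_length_m / tile_size_m

    # Hypothetical floor plane of 3.2 m x 2.1 m:
    print(repeat_counts(3.2, 2.1))  # (6.4, 4.2) texture repeats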

Drawing a shape with dimensions in millimeters

I have dimensions in millimeters (mostly rectangles and squares) and I'm trying to draw them at their actual size.
Something like 6.70 x 4.98 x 3.33 mm.
I really won't be using the depth of the object, but I just threw it in.
New to drawing shapes with my hands ;)
Screens are typically measured in pixels (Android) or points (iOS); both are logical, density-independent units rather than exact physical measurements, and modern devices also have different pixel ratios. To figure out an exact physical size, you need to determine the current device's screen size and its pixel ratio. Both can be obtained from WidgetsBinding.instance.window... Then you just do the math from there to convert between those measurements and mm.
However, this seems like an odd requirement, so you may just be asking how to draw a square of an exact size. You may want to look into the Canvas/Paint API, which can be used in conjunction with a CustomPainter. Another option is a Stack with Positioned.fromRect or .fromRelativeRect, drawing the shapes using that setup.
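For the unit conversion itself, a minimal sketch (Python, arithmetic only; the DPI and pixel-ratio values are assumptions, on a real device you would read the screen size and devicePixelRatio from WidgetsBinding.instance.window):

    MM_PER_INCH = 25.4

    def mm_to_physical_pixels(mm, dpi):
        # Physical length in millimetres -> device (physical) pixels.
        return mm / MM_PER_INCH * dpi

    def mm_to_logical_pixels(mm, dpi, device_pixel_ratio):
        # Millimetres -> the logical pixels a Flutter Canvas draws in.
        return mm_to_physical_pixels(mm, dpi) / device_pixel_ratio

    # Assumed values for illustration: a ~403 dpi phone with a 2.625 pixel ratio.
    print(mm_to_logical_pixels(6.70, 403, 2.625))  # ~40.5 logical px
    print(mm_to_logical_pixels(4.98, 403, 2.625))  # ~30.1 logical px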

Relationship of video coordinates before/after resizing

I have a 720x576 video that was played full screen on a display with a 1280x960 resolution, plus the corresponding eye-tracker gaze coordinate data.
I have built gaze-tracking visualization code, but the one thing I am not sure about is how to convert my input coordinates to match the original video.
So, does anybody have an idea on what to do?
The native aspect ratio of the video (720/576 = 1.25) does not match the aspect ratio at which it was displayed (1280/960 = 1.33). i.e. the pixels didn't just get scaled in size, but in shape.
So assuming your gaze coordinates were calibrated to match the physical screen (1280 × 960), then you will need to independently scale the x coordinates by 720/1280 = 0.5625 and the y coordinates by 576/960 = 0.6.
Note that this will distort the actual gaze behaviour (horizontal saccades are being scaled by more than vertical ones). Your safest option would actually be to rescale the video to have the same aspect ratio as the screen, and project the gaze coordinates onto that. That way, they won't be distorted, and the slightly skewed movie will match what was actually shown to the subjects.
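A minimal sketch of the first (independent-scaling) option in Python, using the dimensions from the question:

    SCREEN_W, SCREEN_H = 1280, 960  # resolution the gaze data was recorded at
    VIDEO_W, VIDEO_H = 720, 576     # native resolution of the video

    def screen_to_video(gaze_x, gaze_y):
        # Map a gaze point from screen coordinates to native video coordinates.
        return gaze_x * VIDEO_W / SCREEN_W, gaze_y * VIDEO_H / SCREEN_H

    print(screen_to_video(640, 480))  # screen centre -> (360.0, 288.0)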

reported vertical / horizontal camera angle of view is not consistent with supported picture sizes. Which gets clipped or filled on display?

The reported vertical / horizontal camera angle of view is not consistent with supported picture sizes. Presumably this discrepancy is resolved by filling / clipping before an image is returned in the onPicture() callback. How is it resolved? I would like to correctly measure angles by processing the picture.
Actually, the angles of view are consistent with some of the supported picture sizes. In the case of a Samsung Captivate, the angles are reported as 51.2 x 39.4. Taking the ratio of the tangents of half these angles, you get 1.338. This agrees closely enough with the aspect ratio of the 640x480, 1600x1200, 2048x1536, and 2560x1920 picture sizes: 1.333...
Additionally, the angles do not change when you change the picture size or the zoom, so they describe the hardware, specifically the relationship between lens and sensor.
So my question only applies to the other picture sizes, the ones with an aspect ratio of 1.66....
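The arithmetic behind the 1.338 figure, as a small Python check (the angles are the ones reported by the Samsung Captivate):

    import math

    h_fov, v_fov = 51.2, 39.4  # reported horizontal / vertical angles of view (degrees)

    # The ratio of the tangents of the half-angles gives the sensor aspect ratio.
    ratio = math.tan(math.radians(h_fov / 2)) / math.tan(math.radians(v_fov / 2))
    print(ratio)   # ~1.338
    print(4 / 3)   # 1.333..., the aspect ratio of 640x480, 1600x1200, and so on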

How to convert a 3 meter distance to pixels?

I have a 3D object and I need to project it relative to a 2D image, which was captured by a camera 3 m away. When I tried to build a projection matrix, I found that I need to specify the camera's distance from the object (3 m) and the height of the camera above the ground (1 m). Thus, I need to convert these values, measured in meters, to pixels so that they can be used in the projection matrix.
I need to do the computation in Matlab. Any pointers?
get(0,'ScreenPixelsPerInch')
Gives you the missing ingredient in Quentin's solution.
1 metre = 39.3700787 inches, so pixels = dpi * distance_in_metres * 39.3700787.
Obviously you will need to know the DPI of the output device.
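Putting that together, a minimal sketch of the conversion (shown in Python for clarity; the same expression works in MATLAB with the value returned by get(0,'ScreenPixelsPerInch'), and the 96 dpi used here is only an assumption):

    INCHES_PER_METRE = 39.3700787

    def metres_to_pixels(distance_m, dpi):
        # Physical distance in metres -> pixels at the given output DPI.
        return dpi * distance_m * INCHES_PER_METRE

    # Assumed 96 dpi display: the 3 m camera distance and 1 m camera height in pixels.
    print(metres_to_pixels(3.0, 96))  # ~11339 px
    print(metres_to_pixels(1.0, 96))  # ~3779.5 px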