I am building a Flutter application that needs precise measurements of the screen in cm / inches.
According to the docs,
By definition, there are roughly 38 logical pixels per centimeter, or about 96 logical pixels per inch, of the physical display. The value returned by devicePixelRatio is ultimately obtained either from the hardware itself, the device drivers, or a hard-coded value stored in the operating system or firmware, and may be inaccurate, sometimes by a significant margin.
I have tried using these ratios in my application but they are not even close.
Is there a way to accurately calculate the dimensions of the screen?
Flutter's pixel coordinates are given in logical pixels rather than physical pixels. However, MediaQuery will give you the conversion ratio.
var mediaQuery = MediaQuery.of(context);
var physicalPixelWidth = mediaQuery.size.width * mediaQuery.devicePixelRatio;
var physicalPixelHeight = mediaQuery.size.height * mediaQuery.devicePixelRatio;
(Note that this code can only be run in a place where a BuildContext object is available, such as a build method or a StatefulWidget's companion State class.)
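If you only need an approximate physical size, here is a minimal sketch that applies the rough constants quoted above (about 38 logical pixels per centimeter, or about 96 per inch). As the docs themselves warn, these constants can be significantly off on real hardware, so treat the result as an estimate only:

import 'package:flutter/widgets.dart';

// Approximate physical screen size derived from logical pixels.
// The 38 px/cm and 96 px/inch constants are only rough values.
Size approxScreenSizeCm(BuildContext context) {
  const logicalPxPerCm = 38.0;
  final size = MediaQuery.of(context).size; // logical pixels
  return Size(size.width / logicalPxPerCm, size.height / logicalPxPerCm);
}

Size approxScreenSizeInches(BuildContext context) {
  const logicalPxPerInch = 96.0;
  final size = MediaQuery.of(context).size;
  return Size(size.width / logicalPxPerInch, size.height / logicalPxPerInch);
}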
I am not interested in the inner workings of logical pixels; I just want to know whether Flutter automatically uses logical pixels.
Container(
  width: 100,
  child: ...,
)
Does Flutter use 100 physical pixels or 100 logical pixels as the width here? I can't figure it out.
You can print the screen width to see what it looks like:
double kScreenWidth(BuildContext ctx) => MediaQuery.of(ctx).size.width;
In
Container(width: 100, ...)
the width is in logical pixels. It's that simple: what you see is what you get.
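A quick sketch to see both numbers at once: the container below is 100 logical pixels wide on every device; only the number of physical pixels behind it changes with devicePixelRatio.

import 'package:flutter/material.dart';

class LogicalPixelDemo extends StatelessWidget {
  const LogicalPixelDemo({super.key});

  @override
  Widget build(BuildContext context) {
    final dpr = MediaQuery.of(context).devicePixelRatio;
    // 100 logical pixels on every device; the hardware renders it
    // with 100 * dpr physical pixels.
    return Container(
      width: 100,
      color: Colors.blue,
      child: Text('100 logical px = ${100 * dpr} physical px'),
    );
  }
}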
Flutter follows a simple density-based format like iOS. Assets might be 1.0x, 2.0x, 3.0x, or any other multiplier.
Flutter doesn’t have dps but there are logical pixels, which are basically the same as device-independent pixels. The so-called devicePixelRatio expresses the ratio of physical pixels in a single logical pixel.
from the Flutter dev docs
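As an illustration of that density-based format, the usual Flutter convention is to ship resolution variants in ratio-named subdirectories and let the framework pick one based on devicePixelRatio. The asset names below are just examples:

// Example asset layout (hypothetical file names):
//   assets/images/logo.png        <- 1.0x baseline
//   assets/images/2.0x/logo.png   <- picked when devicePixelRatio is near 2.0
//   assets/images/3.0x/logo.png   <- picked when devicePixelRatio is near 3.0
//
// Only the base asset needs to be listed under flutter/assets in pubspec.yaml.

import 'package:flutter/material.dart';

// Flutter chooses the closest variant automatically, but lays the image
// out at its 1.0x logical size either way.
final logo = Image.asset('assets/images/logo.png');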
Given any screen resolution, is there a way that I can figure out the amount of points in an inch? For instance, if I wanted to create an NSView that was 8.5 inches by 11 inches (like a sheet of a paper), is there an algorithm that will allow me to obtain the correct point values for the frame across many different types of Macs and screen resolutions?
It's not straightforward. I'm not sure there's a good way. I can provide an approach, but I haven't confirmed that this works reliably:
First, you can use CGDisplayScreenSize() to get the screen's physical size in millimeters. You can obtain the CGDirectDisplayID for a screen from NSScreen, which you can, in turn, get from the window. Obtain the screen's deviceDescription and get the value for the "NSScreenNumber" key. That may need to be cast to CGDirectDisplayID.
The problem from there is that the display mode may not fill the screen. It could be letterboxed or pillarboxed. Or, it might be stretched. This should be fairly uncommon these days, but still possible. You can obtain the display mode using CGDisplayCopyDisplayMode(). To determine if it's stretched, you can examine its ioFlags to see if they contain the bitmask kDisplayModeStretchedFlag (declared in IOKit).
If it's stretched, the screen's frame will have to be mapped to its size in millimeters separately for the X and Y axes. You assume the screen's frame.width (in points) maps to the full physical width, and similarly for the height.
If the mode is not stretched, you'll have to compare the aspect ratio of the frame with that of the screen's physical size to see if it's letter- or pillarboxed. If the aspect ratios are very close, then it's presumably not. That case is similar to the stretched case, but the width and height mappings should be equivalent.
If the aspect ratios differ significantly, then you compare them. If the screen's physical aspect ratio is larger than the frame's, then the screen is physically wider than the mode is using (pillarboxed). So, you compute the mapping from points to millimeters from the two heights. If the physical aspect ratio is smaller than the logical one, then the mode is letterboxed and you use the widths to compute the mapping.
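Just to make the aspect-ratio bookkeeping above concrete, here is an unverified sketch of the mapping step alone, written in Dart for consistency with the rest of this page. It assumes you have already obtained the screen frame (in points), the physical size in millimeters from CGDisplayScreenSize(), and the stretched flag from the display mode; the 0.01 aspect-ratio tolerance is an arbitrary assumption.

// Returns millimetres per point for the X and Y axes, following the
// stretched/letterbox/pillarbox reasoning described above.
({double x, double y}) mmPerPoint({
  required double frameWidthPts,
  required double frameHeightPts,
  required double physWidthMm,
  required double physHeightMm,
  required bool isStretched,
}) {
  final frameAspect = frameWidthPts / frameHeightPts;
  final physAspect = physWidthMm / physHeightMm;

  if (isStretched) {
    // Stretched mode: map each axis independently.
    return (x: physWidthMm / frameWidthPts, y: physHeightMm / frameHeightPts);
  }
  if ((physAspect - frameAspect).abs() < 0.01) {
    // Aspect ratios nearly match: no letter- or pillarboxing, so one
    // scale serves both axes.
    final scale = physWidthMm / frameWidthPts;
    return (x: scale, y: scale);
  }
  if (physAspect > frameAspect) {
    // Screen is physically wider than the mode uses (pillarboxed):
    // the heights give the true scale.
    final scale = physHeightMm / frameHeightPts;
    return (x: scale, y: scale);
  }
  // Screen is physically narrower (letterboxed): the widths give the scale.
  final scale = physWidthMm / frameWidthPts;
  return (x: scale, y: scale);
}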
I have images stored as blobs in SQLite. Other tools like DB Browser for SQLite show that the images themselves are not upscaled.
I scaled them down from an original image with the following code.
final thumbnailData = encodeJpg(copyResize(
  decodeImage(imageData),
  width: 400,
  interpolation: Interpolation.average,
));
When displayed in Flutter, they are noticeably upscaled.
@override
Widget build(BuildContext context) {
  return Image.memory(_getThumbnailData());
}
Image.memory() has a scale argument that defaults to 1.0. Setting it manually to be sure doesn't help either.
I have to set it to some guesstimated value like 2.0 to get the correct scale, but I don't understand why, or whether 2.0 is actually "unscaled" or still slightly off.
How can I tell Flutter to display the images as they are?
Flutter uses logical pixels instead of physical pixels.
Device pixels are also referred to as physical pixels. Logical pixels are also referred to as device-independent or resolution-independent pixels.
How to convert between physical pixels and logical pixels?
To convert between physical pixels and logical pixels, you can use devicePixelRatio.
The number of device pixels for each logical pixel. This number might not be a power of two. Indeed, it might not even be an integer. For example, the Nexus 6 has a device pixel ratio of 3.5.
MediaQuery.of(context).devicePixelRatio
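Based on that, one approach is to pass the device pixel ratio as the scale argument of Image.memory, so that one image pixel is drawn on one physical pixel rather than one logical pixel. A minimal sketch (the bytes field stands in for the blob read from SQLite):

import 'dart:typed_data';
import 'package:flutter/material.dart';

class UnscaledThumbnail extends StatelessWidget {
  const UnscaledThumbnail({super.key, required this.bytes});

  final Uint8List bytes; // e.g. the thumbnail blob from SQLite

  @override
  Widget build(BuildContext context) {
    // With scale == devicePixelRatio, a 400 px wide thumbnail occupies
    // 400 / devicePixelRatio logical pixels, i.e. 400 physical pixels.
    return Image.memory(
      bytes,
      scale: MediaQuery.of(context).devicePixelRatio,
    );
  }
}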
In my Flutter application I am trying to get the real screen width (which can naturally be different on each device).
I am using MediaQuery.of(context).size.width but I've noticed that the values returned do not match the real screen resolution.
For instance,
On an iPhone 11 Pro Max simulator (which has a resolution of 2688 x 1242) I get MediaQuery.of(context).size.width = 414
On a Nexus XL emulator (which has a resolution of 1440 x 2560) I get MediaQuery.of(context).size.width = 411.42857142857144
On a real iPhone 7 (which has a resolution of 1334 x 750) I get MediaQuery.of(context).size.width = 375
Does anyone know why the value returned by MediaQuery differs from the real screen resolution in pixels?
Thanks
According to the size property's documentation:
The size of the media in logical pixels (e.g., the size of the screen). Logical pixels are roughly the same visual size across devices. Physical pixels are the size of the actual hardware pixels on the device. The number of physical pixels per logical pixel is described by the devicePixelRatio.
So you would do MediaQuery.of(context).size.width * MediaQuery.of(context).devicePixelRatio to get the width in physical pixels.
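For example, on the iPhone 11 Pro Max above the devicePixelRatio is 3.0, so 414 logical pixels * 3.0 = 1242 physical pixels, which matches the advertised 1242 x 2688 resolution. A quick sketch that prints both values:

import 'package:flutter/widgets.dart';

void printScreenWidths(BuildContext context) {
  final mq = MediaQuery.of(context);
  final logical = mq.size.width;                  // e.g. 414.0
  final physical = logical * mq.devicePixelRatio; // e.g. 414.0 * 3.0 = 1242.0
  debugPrint('logical width: $logical, physical width: $physical');
}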
First question:
If I create a Container widget with width set to 50 (logical pixels), how many physical pixels will this widget occupy eventually?
Is it correct to assume the answer will be 50 * devicePixelRatio, and that it will be rounded off to an integer value?
Second question:
This question technically arises from the first one. Let's say we have two devices with the same screen width and the same resolution but different screen heights. In theory, the width value in logical pixels should be the same for both devices, right?
However, one devicePixelRatio will be higher. Hence, the eventual width value in physical pixels will be different, and this might cause one of the Containers to overflow, since my assumption is that the eventual Container widget width will be multiplied by the devicePixelRatio.
Is it correct to assume the answer will be 50 * devicePixelRatio, and that it will be rounded off to an integer value?
Yes
This question technically arises from the first one. Let's say we have two devices with the same screen width and the same resolution but different screen heights. In theory, the width value in logical pixels should be the same for both devices, right? However, one devicePixelRatio will be higher. Hence, the eventual width value in physical pixels will be different, and this might cause one of the Containers to overflow, since my assumption is that the eventual Container widget width will be multiplied by the devicePixelRatio.
Let's say dvp (devicePixelRatio) = 1 for the first device and dvp = 2 for the second. A width of 50 on the first phone will be
1 * 50 = 50 device pixels
and on the second device it will be
2 * 50 = 100 device pixels
But that does NOT mean the Container will be 2x as wide on the second phone and that you may run into an overflow error. Not at all!
On the first phone, 50 device pixels are used to draw a 50-logical-pixel box (less sharpness); on the second phone, 100 device pixels are used to draw the same 50-logical-pixel box (more sharpness).
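The same arithmetic as a tiny runnable sketch, using the two hypothetical devices from the example above:

const widthLogical = 50.0;
const dpr1 = 1.0; // first device
const dpr2 = 2.0; // second device

void main() {
  // Device pixels used for rendering, not the layout size:
  print('device 1: ${widthLogical * dpr1} device pixels'); // 50.0
  print('device 2: ${widthLogical * dpr2} device pixels'); // 100.0
  // On both devices the Container is laid out at 50 logical pixels,
  // so nothing overflows; the second device just renders it more sharply.
}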