First question:
If I create a Container widget with its width set to 50 (logical pixels), how many physical pixels will this widget eventually occupy?
Is it correct to assume the answer will be 50 * devicePixelRatio, rounded off to an integer value?
Second question:
This question technically arises from the first one. Let's say we have two devices with the same screen width and the same resolution but different screen heights. In theory, the width value in logical pixels should be the same for both devices, right?
However, one devicePixelRatio will be higher. Hence, the eventual width in physical pixels will be different, and this might cause one of the Containers to overflow, since my assumption is that the eventual Container widget width is multiplied by the devicePixelRatio.
Is it correct to assume the answer will be 50 * devicePixelRatio, rounded off to an integer value?
Yes
This question technically arises from the first one. Let's say we have two devices with the same screen width and the same resolution but different screen heights. In theory, the width value in logical pixels should be the same for both devices, right? However, one devicePixelRatio will be higher. Hence, the eventual width in physical pixels will be different, and this might cause one of the Containers to overflow, since my assumption is that the eventual Container widget width is multiplied by the devicePixelRatio.
Let's say devicePixelRatio = 1 for the first device and devicePixelRatio = 2 for the second. So a width of 50 on the first phone will be
1 * 50 = 50 device pixels
and on the second device it will be
2 * 50 = 100 device pixels.
But that does NOT mean the Container's width is going to be 2x on the second phone, or that you will run into an overflow error. Not at all!
Actually, on the first phone 50 device pixels are used to paint a 50-logical-pixel box (less sharpness), and on the second phone 100 device pixels are used to paint the same 50-logical-pixel box (more sharpness).
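To make that concrete, here is a minimal Dart sketch (the widget and variable names are illustrative) showing that layout happens in logical pixels, and devicePixelRatio only affects how many device pixels are used to paint the box:

import 'package:flutter/widgets.dart';

// Illustrative widget; layout is always done in logical pixels.
class FixedWidthBox extends StatelessWidget {
  const FixedWidthBox({super.key});

  @override
  Widget build(BuildContext context) {
    const logicalWidth = 50.0;
    final dpr = MediaQuery.of(context).devicePixelRatio;

    // Number of device pixels used to paint the 50-logical-pixel box.
    final devicePixels = (logicalWidth * dpr).round();
    debugPrint('50 logical px ≈ $devicePixels device px (devicePixelRatio = $dpr)');

    // The Container occupies 50 logical pixels on both devices, so it
    // cannot overflow on one of them just because dpr is higher.
    return Container(width: logicalWidth, height: logicalWidth);
  }
}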
Related
How to get the screen's (or pixel's) physical width and height in Flutter?
Is it possible?
I need to display exactly 1 cm on different screens programmatically.
Thanks.
You can use:
MediaQuery.of(context).size
/// The size of the media in logical pixels (e.g, the size of the screen).
///
/// Logical pixels are roughly the same visual size across devices. Physical
/// pixels are the size of the actual hardware pixels on the device. The
/// number of physical pixels per logical pixel is described by the
/// [devicePixelRatio].
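If the goal is roughly 1 cm on screen, one sketch (only approximate, since the value the OS reports can be off, as discussed further below) is to use the documented figure of about 38 logical pixels per centimeter:

import 'package:flutter/widgets.dart';

// Documented approximation: ~38 logical pixels per centimeter. Treat this
// as a rough guide only; it will not be exactly 1 cm on every device.
const logicalPixelsPerCm = 38.0;

class ApproxOneCmSquare extends StatelessWidget {
  const ApproxOneCmSquare({super.key});

  @override
  Widget build(BuildContext context) {
    final screenSize = MediaQuery.of(context).size; // logical pixels
    debugPrint('Screen: ${screenSize.width} x ${screenSize.height} logical px');
    return const SizedBox(width: logicalPixelsPerCm, height: logicalPixelsPerCm);
  }
}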
Given any screen resolution, is there a way that I can figure out the amount of points in an inch? For instance, if I wanted to create an NSView that was 8.5 inches by 11 inches (like a sheet of a paper), is there an algorithm that will allow me to obtain the correct point values for the frame across many different types of Macs and screen resolutions?
It's not straightforward. I'm not sure there's a good way. I can provide an approach, but I haven't confirmed that this works reliably:
First, you can use CGDisplayScreenSize() to get the screen's physical size in millimeters. You can obtain the CGDirectDisplayID for a screen from NSScreen, which you can, in turn, get from the window. Obtain the screen's deviceDescription and get the value for the "NSScreenNumber" key. That may need to be cast to CGDirectDisplayID.
The problem from there is that the display mode may not fill the screen. It could be letterboxed or pillarboxed. Or, it might be stretched. This should be fairly uncommon these days, but still possible. You can obtain the display mode using CGDisplayCopyDisplayMode(). To determine if it's stretched, you can examine its ioFlags to see if they contain the bitmask kDisplayModeStretchedFlag (declared in IOKit).
If it's stretched, the screen's frame will have to be mapped to its size in millimeters separately for the X and Y axes. You assume the screen's frame.width (in points) maps to the full physical width, and similarly for the height.
If the mode is not stretched, you'll have to check the aspect ratio of the frame and the screen physical size to see if it's letter- or pillarboxed. If the aspect ratios are very close, then it's presumably not. That case is similar to the stretched case, but the width and height mappings should be equivalent.
If the aspect ratios differ significantly, then you compare them. If the screen's physical aspect ratio is larger than the frame's, then the screen is physically wider than the mode is using (pillarboxed). So, you compute the mapping from points to millimeters from the two heights. If the physical aspect ratio is smaller than the logical one, then the mode is letterboxed and you use the widths to compute the mapping.
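The platform calls (NSScreen, CGDisplayScreenSize(), CGDisplayCopyDisplayMode()) have to be made in native code, but the aspect-ratio bookkeeping itself is plain arithmetic. Here is a rough sketch of just that part in Dart (all names are hypothetical, and the inputs are assumed to come from the APIs described above):

// Millimeters per point along each axis, derived as described above.
class PointToMmScale {
  const PointToMmScale(this.mmPerPointX, this.mmPerPointY);
  final double mmPerPointX;
  final double mmPerPointY;
}

PointToMmScale mapPointsToMm({
  required double frameWidthPts,   // screen frame width in points
  required double frameHeightPts,  // screen frame height in points
  required double physicalWidthMm, // from CGDisplayScreenSize()
  required double physicalHeightMm,
  required bool isStretched,       // kDisplayModeStretchedFlag set?
}) {
  final frameAspect = frameWidthPts / frameHeightPts;
  final physicalAspect = physicalWidthMm / physicalHeightMm;

  if (isStretched) {
    // Stretched mode: map each axis to the full physical size separately.
    return PointToMmScale(
      physicalWidthMm / frameWidthPts,
      physicalHeightMm / frameHeightPts,
    );
  }

  const tolerance = 0.01;
  if ((frameAspect - physicalAspect).abs() < tolerance) {
    // Aspect ratios are very close: presumably no letter/pillarboxing.
    final scale = physicalWidthMm / frameWidthPts;
    return PointToMmScale(scale, scale);
  }

  if (physicalAspect > frameAspect) {
    // Screen is physically wider than the mode uses (pillarboxed),
    // so derive the scale from the two heights.
    final scale = physicalHeightMm / frameHeightPts;
    return PointToMmScale(scale, scale);
  }

  // Otherwise the mode is letterboxed: derive the scale from the widths.
  final scale = physicalWidthMm / frameWidthPts;
  return PointToMmScale(scale, scale);
}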
I am building a flutter application that needs precise measurements of the screen in cm / inches.
According to the docs,
By definition, there are roughly 38 logical pixels per centimeter, or about 96 logical pixels per inch, of the physical display. The value returned by devicePixelRatio is ultimately obtained either from the hardware itself, the device drivers, or a hard-coded value stored in the operating system or firmware, and may be inaccurate, sometimes by a significant margin.
I have tried using these ratios in my application but they are not even close.
Is there a way to accurately calculate the dimensions of the screen?
Flutter's pixel coordinates are given in logical pixels rather than physical pixels. However, MediaQuery will give you the conversion ratio.
var mediaQuery = MediaQuery.of(context);
var physicalPixelWidth = mediaQuery.size.width * mediaQuery.devicePixelRatio;
var physicalPixelHeight = mediaQuery.size.height * mediaQuery.devicePixelRatio;
(Note that this code can only be run in a place where a BuildContext object is available, such as a build method or a StatefulWidget's companion State class.)
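For example, a minimal (illustrative) widget that runs the conversion inside build, where a BuildContext is available:

import 'package:flutter/widgets.dart';

class PhysicalSizeProbe extends StatelessWidget {
  const PhysicalSizeProbe({super.key, required this.child});
  final Widget child;

  @override
  Widget build(BuildContext context) {
    final mediaQuery = MediaQuery.of(context);
    final physicalPixelWidth =
        mediaQuery.size.width * mediaQuery.devicePixelRatio;
    final physicalPixelHeight =
        mediaQuery.size.height * mediaQuery.devicePixelRatio;
    debugPrint('Physical size: $physicalPixelWidth x $physicalPixelHeight px');
    return child;
  }
}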
In my Flutter application I am trying to get the real screen width (which can naturally be different on each device).
I am using MediaQuery.of(context).size.width but I've noticed that the values returned do not match the real screen resolution.
For instance:
On an iPhone 11 Pro Max simulator (resolution 2688 x 1242) I get MediaQuery.of(context).size.width = 414
On a Nexus XL emulator (resolution 1440 x 2560) I get MediaQuery.of(context).size.width = 411.42857142857144
On a real iPhone 7 (resolution 1334 x 750) I get MediaQuery.of(context).size.width = 375
Does anyone know why the value returned by MediaQuery differs from the real screen resolution in pixels?
Thanks
According to the size property's documentation:
The size of the media in logical pixels (e.g, the size of the screen).
Logical pixels are roughly the same visual size across devices.
Physical pixels are the size of the actual hardware pixels on the
device. The number of physical pixels per logical pixel is described
by the devicePixelRatio.
So you would do MediaQuery.of(context).size.width * MediaQuery.of(context).devicePixelRatio to get the width in physical pixels.
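As a quick sanity check with the numbers from the question (the devicePixelRatio of 3.0 for the iPhone 11 Pro Max is an assumption here, not taken from the post):

void main() {
  const logicalWidth = 414.0;   // MediaQuery.of(context).size.width
  const devicePixelRatio = 3.0; // assumed value for that simulator
  final physicalWidth = logicalWidth * devicePixelRatio;
  // Prints 1242.0, which matches the 2688 x 1242 hardware resolution.
  print('Physical width: $physicalWidth px');
}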
While trying to create a repeated tile overlay, I've found many questions (like this one)
mentioning that repeated images in Cocos2d must have height and width dimensions that are powers of two.
This raises two questions. First, why is this a limitation? Second, and more importantly, how can I create a repeating, scrolling image that has dimensions that are not a power of two? What if I have a really wide background (say 4000 pixels) and I want it to repeat across the X axis? What should I do in that context? I can't believe the "correct" answer is to add an additional 96 pixels to the width, and increase the height of the image to 4096 as well. That's wasted bytes!
This answer has excellent info on why power-of-two textures are needed:
Why do images for textures on the iPhone need to have power-of-two dimensions?
As for your second question, the texture does not have to be square; just both the width and the height have to be a power of 2. So you could have an image that is 4096x128 repeating as your background. Keep in mind also that textures, no matter what their size, are always stored in memory at an uncompressed power-of-two size. So an image with a width of 4000 and an image with a width of 4096 are actually using the same amount of memory.
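To see why the 4000-pixel-wide and 4096-pixel-wide images end up costing the same memory, here is a small sketch (assuming an uncompressed RGBA texture at 4 bytes per pixel; the exact pixel format depends on how you load the texture):

// Smallest power of two that is >= n (for n >= 1).
int nextPowerOfTwo(int n) {
  var p = 1;
  while (p < n) {
    p <<= 1;
  }
  return p;
}

void main() {
  const bytesPerPixel = 4; // assumed uncompressed RGBA8888
  const height = 128;

  for (final width in [4000, 4096]) {
    final storedWidth = nextPowerOfTwo(width);
    final storedHeight = nextPowerOfTwo(height);
    final bytes = storedWidth * storedHeight * bytesPerPixel;
    print('$width x $height image -> stored as $storedWidth x $storedHeight, $bytes bytes');
  }
}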