Get system UIScale factor using gtk3 APIs - gtk

I am working on an SDK which uses gtk3 for windowing and event handling. I am trying to find the system UIScale using the gtk3 APIs. The scale can be greater than 1.0 on HiDPI screens. I was hoping that gtk_window_get_scale_factor would return the correct scale factor, but it always returns 1, even when I have set the scale to 2.0.
I know that gtk is aware of the scale factor, as native gtk apps are scaled properly. I have run gtk3-demo and it scales correctly depending on the system UIScale. Can someone please help me get the scaling factor using the gtk3 APIs?
PS: The scale is set using these steps: go to "Settings -> Displays -> Scale" and set the scale to 2.0. I am testing this on Fedora 32 and Ubuntu 20.04.

If what you are looking for is the DPI scale (usually set by the GDK_DPI_SCALE environment variable), I would use a variant of the following code snippet:
static gdouble get_dpi_scale(GdkScreen *screen)
{
    gdouble dpi = gdk_screen_get_resolution(screen);
    GtkSettings *settings = gtk_settings_get_default();
    gint xft_dpi = 0;

    /* "gtk-xft-dpi" is stored as 1024 * dots-per-inch */
    g_object_get(settings, "gtk-xft-dpi", &xft_dpi, NULL);
    g_return_val_if_fail(xft_dpi > 0, 0);

    return dpi / (xft_dpi / 1024.0);
}
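If what you need instead is the integer HiDPI scale the question asks about, GDK reports that per monitor. Below is a rough, untested sketch that combines both; it assumes a 96 DPI baseline for the fractional text scale, and the helper name is mine, not a GTK API:

static gdouble get_ui_scale(void)
{
    GdkDisplay *display = gdk_display_get_default();
    GdkMonitor *monitor = gdk_display_get_primary_monitor(display);

    /* Integer scale: 2 on "true" HiDPI, but often stays 1 when fractional
     * scaling is done via fonts (which matches the question's observation). */
    gint window_scale = monitor != NULL ? gdk_monitor_get_scale_factor(monitor) : 1;

    /* Fractional text scale: gtk-xft-dpi is 1024 * dots-per-inch. */
    gint xft_dpi = 0;
    g_object_get(gtk_settings_get_default(), "gtk-xft-dpi", &xft_dpi, NULL);
    gdouble text_scale = xft_dpi > 0 ? (xft_dpi / 1024.0) / 96.0 : 1.0;

    return window_scale * text_scale;
}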

Related

cross-platform techniques to determine screen dimensions

Is there a good cross-platform way of getting the screen dimensions?
Frequently I do this with PerlTk:
use Tk;
my $mw = MainWindow->new;
my $screen_width = $mw->screenwidth();
my $screen_height = $mw->screenheight();
But it'd be better to not have to load all of Tk just to do this.
This looks like a good X11 specific way of doing these things (GetRootWindow should work for screen dimensions):
Perl: Getting the geometry of a window with X11 WindowID
But I think a cross-platform approach would be better.
Specifically, I'm looking for ways to determine the monitor dimensions in pixels, which is what Tk's screenwidth and screenheight return.
On most POSIX-y systems:
use Curses ();
my $screen_width = $Curses::COLS;
my $screen_height = $Curses::LINES;
These values don't update automatically when the screen is resized.
The best I can see for getting display/screen resolution is to combine OS-specific tools.
On X11 the best bet is probably xrandr† while on Windows it'd be Win32::GUI or Win32::API.
Then wrap it in a sub that checks for the OS (using $^O or such) and selects a tool to use.
(Or, of course, use a GUI package like Perl/Tk, which the OP is already using.)
† For example
use strict;
use warnings;
use feature 'say';

for my $line_out ( qx(xrandr) ) {
    #print "--> $line_out";
    if ( my ($wt, $ht) = $line_out =~ /^\s+([0-9]+)\s*x\s*([0-9]+)/ ) {
        say "Width: $wt, height: $ht";
        last;
    }
}
There are many ways to parse that, and in principle I recommend using a library to run external programs (in particular to capture their output and errors); this is just a quick demo.
On my system the xrandr output is like
Screen 0: minimum 8 x 8, current 5040 x 1920, maximum 32767 x 32767
DP-0 connected primary 1920x1200+3120+418 (normal left inverted right x axis y axis) 519mm x 320mm
1920x1200 59.95*+
1600x1200 60.00
1280x1024 75.02 60.02
1152x864 75.00
...
and it keeps going, then repeats for the other displays.
So I pick the top resolution for the first listed one ("primary").
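To follow up on the wrap-it-in-a-sub suggestion, here is a minimal sketch of the $^O dispatch; the Win32 branch assumes Win32::GUI's GetSystemMetrics (SM_CXSCREEN/SM_CYSCREEN), and everything else falls through to the xrandr parsing from above:

use strict;
use warnings;
use feature 'say';

sub screen_size {
    if ($^O eq 'MSWin32') {
        require Win32::GUI;
        # 0 = SM_CXSCREEN, 1 = SM_CYSCREEN
        return (Win32::GUI::GetSystemMetrics(0), Win32::GUI::GetSystemMetrics(1));
    }
    # Otherwise assume an X11-ish system with xrandr available
    for my $line ( qx(xrandr) ) {
        return ($1, $2) if $line =~ /^\s+([0-9]+)\s*x\s*([0-9]+)/;
    }
    return;
}

my ($w, $h) = screen_size();
say "Width: $w, height: $h" if defined $w;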

Is there a maximum value for the volume depth of a render texture?

I'm using a render texture as an array of textures by specifying the size of the array in the volume depth property. But sometimes, when I exceed a certain value (e.g. for 128x128 textures it's 45...), it returns an error: D3D11: Failed to create RenderTexture (128 x 128 fmt 39 aa 1), error 0x80070057, which isn't very clear. So I suppose this property has a maximum value? But I could not find it in the Unity manual or anywhere else online.
Does anyone know this value, or could you tell me where I can find it?
The width, height, and depth must be equal to or less than D3D11_REQ_TEXTURE3D_U_V_OR_W_DIMENSION (2048).
Error 0x80070057 is E_INVALIDARG, so most likely you are having issues with some other parameter. Try enabling the Direct3D Debug Device for better information: use -force-d3d11-debug. With Windows 10 or Windows 11, you have to install it by enabling the Windows optional feature Graphics Tools.
See Microsoft Docs.
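For reference, a minimal Unity C# sketch of the setup being discussed, assuming the texture array is created as a Tex2DArray render texture (the format and sizes here are illustrative, not taken from the question):

using UnityEngine;
using UnityEngine.Rendering;

public class SliceArrayTest : MonoBehaviour
{
    void Start()
    {
        var rt = new RenderTexture(128, 128, 0, RenderTextureFormat.ARGB32)
        {
            dimension = TextureDimension.Tex2DArray,
            volumeDepth = 45, // the slice count that reportedly fails
        };

        // Create() returns false when the underlying D3D11 texture cannot be
        // created, which is a cheaper first check than the native error code.
        if (!rt.Create())
            Debug.LogError("RenderTexture creation failed; check format/volumeDepth");
    }
}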

Can Flutter render images from raw pixel data? [duplicate]

Setup
I am using a custom RenderBox to draw.
The canvas object in the code below comes from the PaintingContext in the paint method.
Drawing
I am trying to render pixels individually by using Canvas.drawRect.
I should point out that these are sometimes larger and sometimes smaller than the pixels on screen they actually occupy.
for (int i = 0; i < width * height; i++) {
  // in this case the rect size is 1
  canvas.drawRect(
      Rect.fromLTWH((i % width).toDouble(), (i ~/ width).toDouble(), 1, 1),
      Paint()..color = colors[i]);
}
Storage
I am storing the pixels as a List<List<Color>> (colors in the code above). I tried differently nested lists previously, but they did not cause any noticeable differences in performance.
The memory on my Android Emulator test device increases by 282.7MB when populating the list with a 999x999 image. Note that it only temporarily increases by 282.7MB. After about half a minute, the increase drops to 153.6MB and stays there (without any user interaction).
Rendering
With a resolution of 999x999, the code above causes a GPU max of 250.1 ms/frame and a UI max of 1835.9 ms/frame, which is obviously unacceptable. The UI freezes for two seconds when trying to draw a 999x999 image, which should be a piece of cake (I would guess) considering that 4k video runs smoothly on the same device.
CPU
I am not exactly sure how to track this properly using the Android profiler, but while populating or changing the list, i.e. drawing the pixels (which is the case for the above metrics as well), CPU usage goes from 0% up to 60% with my AVD's performance settings.
Cause
I have no idea where to start since I am not even sure what part of my code causes the freezing. Is it the memory usage? Or the drawing itself?
How would I go about this in general? What am I doing wrong? How should I store these pixels instead?
Efforts
I have tried so much that did not help at all that I will try to only point out the most notable ones:
I tried converting the List<List<Color>> to an Image from the dart:ui library, hoping to use Canvas.drawImage. In order to do that, I tried encoding my own PNG, but I was not able to render more than a single row. It did not look like that would boost performance anyway. When trying to convert a 9999x9999 image, I ran into an out-of-memory exception. Now I am wondering how video is rendered at all, as any 4k video will easily take up more memory than a 9999x9999 image if a few seconds of it are in memory.
I tried implementing the image package. However, I stopped before completing it as I noticed that it is not meant to be used in Flutter but rather in HTML. I would not have gained anything using that.
This one is pretty important for the conclusion I will draw: I tried to just draw without storing the pixels, i.e. using Random.nextInt to generate random colors. When trying to randomly generate a 999x999 image, this resulted in a GPU max of 1824.7 ms/frame and a UI max of 2362.7 ms/frame, which is even worse, especially in the GPU department.
Conclusion
This is the conclusion I reached before trying my failed attempt at rendering using Canvas.drawImage: Canvas.drawRect is not made for this task as it cannot even draw simple images.
How do you do this in Flutter?
Notes
This is basically what I tried to ask over two months ago (yes, I have been trying to resolve this issue for that long), but I think that I did not express myself properly back then and that I knew even less what the actual problem was.
The highest resolution I can properly render is around 10k pixels. I need at least 1m.
I am thinking that abandoning Flutter and going native might be my only option. However, I would like to believe that I am just approaching this problem completely wrong. I have spent about three months trying to figure this out and have not found anything that led me anywhere.
Solution
dart:ui has a function that converts pixels to an Image easily: decodeImageFromPixels
Example implementation
Issue on performance
Does not work in the current master channel
I was simply not aware of this back when I created this answer, which is why I wrote the "Alternative" section.
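As a rough sketch of how decodeImageFromPixels can be wrapped (the helper name is mine, and it assumes the pixels are already packed row by row as RGBA bytes, 4 bytes per pixel):

import 'dart:async';
import 'dart:typed_data';
import 'dart:ui' as ui;

Future<ui.Image> imageFromPixels(Uint8List rgbaBytes, int width, int height) {
  final completer = Completer<ui.Image>();
  ui.decodeImageFromPixels(
    rgbaBytes,
    width,
    height,
    ui.PixelFormat.rgba8888,
    completer.complete, // called with the decoded ui.Image
  );
  return completer.future;
}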
Alternative
Thanks to #pslink for reminding me of BMP after I wrote that I had failed to encode my own PNG.
I had looked into it previously, but I thought it looked too complicated without sufficient documentation. Now, I found this nice article explaining the necessary BMP headers and implemented 32-bit BGRA (ARGB, but BGRA is the order of the default mask) by copying Example 2 from the "BMP file format" Wikipedia article. I went through all the sources but could not find an original source for this example. Maybe the authors of the Wikipedia article wrote it themselves.
Results
Using Canvas.drawImage and my 999x999 pixels converted to an image from a BMP byte list, I get a GPU max of 9.9 ms/frame and a UI max of 7.1 ms/frame, which is awesome!
| ms/frame | Before (Canvas.drawRect) | After (Canvas.drawImage) |
|-----------|---------------------------|--------------------------|
| GPU max | 1824.7 | 9.9 |
| UI max | 2362.7 | 7.1 |
Conclusion
Canvas operations like Canvas.drawRect are not meant to be used like that.
Instructions
First off, this is quite straightforward. However, you need to populate the byte list correctly; otherwise, you will get an error that your data is not correctly formatted and see no results, which can be quite frustrating.
You will need to prepare your image before drawing as you cannot use async operations in the paint call.
In code, you need to use a Codec to transform your list of bytes into an image.
final list = [
  0x42, 0x4d, // 'B', 'M'
  ...];
// make sure that you either know the file size, data size and data offset beforehand
// or that you edit these bytes afterwards
final Uint8List bytes = Uint8List.fromList(list);
final Codec codec = await instantiateImageCodec(bytes);
final Image image = (await codec.getNextFrame()).image;
You need to pass this image to your drawing widget, e.g. using a FutureBuilder.
Now, you can just use Canvas.drawImage in your draw call.
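As an illustration (the class name is mine), a painter that just blits the pre-decoded image could look like this; the image is prepared beforehand, e.g. with the Codec snippet above behind a FutureBuilder:

import 'dart:ui' as ui;
import 'package:flutter/widgets.dart';

class PixelPainter extends CustomPainter {
  const PixelPainter(this.image);

  final ui.Image image;

  @override
  void paint(Canvas canvas, Size size) {
    // One drawImage call instead of width * height drawRect calls.
    canvas.drawImage(image, Offset.zero, Paint());
  }

  @override
  bool shouldRepaint(PixelPainter oldDelegate) => oldDelegate.image != image;
}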

(re)detect number of monitors connected when disconnecting monitor

When multiple monitors are connected to my computer, I can detect them, and draw figures to them by setting the position according to the values obtained from
get(0, 'MonitorPositions')
However, when I disconnect a monitor while MATLAB is running, this property is not updated. I use distFig to handle the positioning of the figures, but since this property is not updated, sometimes the figures are drawn at pixel locations that lie outside my screen (i.e. drawing on my disconnected monitor).
Restarting MATLAB solves the issue, but is there a way to re-detect the number of monitors connected?
I think I found a solution using Java.
I got the Java code from here: How do I get number of available screens?
When getting the number of screens, get(0, 'MonitorPositions') keeps showing the same value, while the Java result changes:
%// Get local graphics environment
%GraphicsEnvironment env = GraphicsEnvironment.getLocalGraphicsEnvironment();
env = java.awt.GraphicsEnvironment.getLocalGraphicsEnvironment();
%// Returns an array of all of the screen GraphicsDevice objects.
%GraphicsDevice[] devices = env.getScreenDevices();
devices = env.getScreenDevices();
%numberOfScreens = devices.length;
numberOfScreens = length(devices)
I tested the code on Windows 10.
In duplicate (mirrored) mode the result is one monitor, and in extended mode it is 2.
When I unplug a monitor, the result is 1.
When unplugging all monitors, the result is also 1 (so it's not a perfect solution).
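If you also need the monitor bounds (not just the count), the same Java objects expose them; here is a sketch along those lines (untested, and note the coordinates follow Java's top-left convention rather than MATLAB's MonitorPositions convention):

env     = java.awt.GraphicsEnvironment.getLocalGraphicsEnvironment();
devices = env.getScreenDevices();

monitorBounds = zeros(length(devices), 4);   % [x y width height] per screen
for k = 1:length(devices)
    dev = devices(k);
    cfg = dev.getDefaultConfiguration();
    b   = cfg.getBounds();                   % java.awt.Rectangle
    monitorBounds(k, :) = [b.getX() b.getY() b.getWidth() b.getHeight()];
end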

Get dpi settings via GTK

Using GTK, how do I query the current screen's dpi settings?
The currently accepted answer is for PHP-GTK, which feels a bit odd to me. The pure GDK library has this call: gdk_screen_get_resolution(), which sounds like a better match. I haven't worked with it myself, so I don't know whether it's generally reliable.
The resolution height and width returned for the screen include the full multi-monitor extents (i.e. the combined size of the display buffer used to render a multi-monitor setup). I have not checked whether the mm (millimetre width/height) calls return the actual physical sizes, but if they also report combined physical sizes, then the DPI computed by dividing one by the other would be meaningless, e.g. for drawing a box on screen that can be measured with a physical ruler.
See GdkScreen. You should be able to compute it using gdk_screen_get_height() and gdk_screen_get_height_mm(), or gdk_screen_get_width() and gdk_screen_get_width_mm().
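For example, a small sketch of that computation (these GdkScreen size calls are deprecated in newer GTK 3 releases but still available; on multi-monitor setups both values may describe the combined virtual screen, as noted above):

static gdouble screen_dpi(GdkScreen *screen)
{
    gdouble width_px = gdk_screen_get_width(screen);
    gdouble width_mm = gdk_screen_get_width_mm(screen);

    if (width_mm <= 0)
        return 96.0;                          /* fall back to the common default */

    return width_px / (width_mm / 25.4);      /* 25.4 mm per inch */
}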