Requested texture dimension exceeds maximum texture size - javafx-8

In JavaFX, I have a program whose render thread sometimes crashes with a message saying that the requested texture dimension is too large. If I read the stack trace correctly, it happens in an NGCanvas (which presumably stands for "native graphics canvas", the implementation side of the JavaFX Canvas node).
I checked the sizes of the canvases I allocate, and none come close to this size. Because JavaFX runs its rendering in its own thread, I don't even know who created this texture.
Is there any way I can find out what tells the render pipeline to allocate such a large texture?
java.lang.RuntimeException: Requested texture dimension (65824) requires dimension (0) that exceeds maximum texture size (16384)
at com.sun.prism.es2.ES2RTTexture.getCompatibleDimension(ES2RTTexture.java:135)
at com.sun.prism.es2.ES2ResourceFactory.getRTTWidth(ES2ResourceFactory.java:146)
at com.sun.scenario.effect.impl.prism.ps.PPSDrawable.getCompatibleWidth(PPSDrawable.java:48)
at com.sun.scenario.effect.impl.prism.ps.PPSRenderer.getCompatibleWidth(PPSRenderer.java:153)
at com.sun.scenario.effect.impl.ImagePool.checkOut(ImagePool.java:119)
at com.sun.scenario.effect.impl.Renderer.getCompatibleImage(Renderer.java:116)
at com.sun.scenario.effect.impl.prism.ps.PPSRenderer.getCompatibleImage(PPSRenderer.java:168)
at com.sun.scenario.effect.impl.prism.ps.PPSRenderer.getCompatibleImage(PPSRenderer.java:67)
at com.sun.scenario.effect.Effect.getCompatibleImage(Effect.java:479)
at com.sun.javafx.sg.prism.NGCanvas$RenderInput.filter(NGCanvas.java:1582)
at com.sun.scenario.effect.FilterEffect.filter(FilterEffect.java:185)
at com.sun.javafx.sg.prism.NGCanvas.applyEffectOnAintoC(NGCanvas.java:737)
at com.sun.javafx.sg.prism.NGCanvas.renderStream(NGCanvas.java:1080)
at com.sun.javafx.sg.prism.NGCanvas.renderContent(NGCanvas.java:606)
at com.sun.javafx.sg.prism.NGNode.doRender(NGNode.java:2053)
at com.sun.javafx.sg.prism.NGNode.render(NGNode.java:1945)
at com.sun.javafx.sg.prism.NGGroup.renderContent(NGGroup.java:235)
at com.sun.javafx.sg.prism.NGRegion.renderContent(NGRegion.java:576)
at com.sun.javafx.sg.prism.NGNode.doRender(NGNode.java:2053)
at com.sun.javafx.sg.prism.NGNode.render(NGNode.java:1945)
at com.sun.javafx.tk.quantum.ViewPainter.doPaint(ViewPainter.java:477)
at com.sun.javafx.tk.quantum.ViewPainter.paintImpl(ViewPainter.java:330)
at com.sun.javafx.tk.quantum.PresentingPainter.run(PresentingPainter.java:91)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:308)
at com.sun.javafx.tk.RenderJob.run(RenderJob.java:58)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at com.sun.javafx.tk.quantum.QuantumRenderer$PipelineRunnable.run(QuantumRenderer.java:125)
at java.lang.Thread.run(Thread.java:745)
As per request: the environment I use is Debian Linux amd64, JDK 1.8 update 66. Rendering happens through Mesa.

In JavaFX, only the JavaFX Application Thread is allowed to modify the canvas.
Try wrapping your draw calls on the GraphicsContext in a Runnable and handing it to Platform.runLater().
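Platform.runLater() essentially posts the Runnable to the queue that the JavaFX Application Thread drains. The pattern can be sketched without any JavaFX dependency by standing in a single-threaded executor for the application thread (all names below are mine, for illustration only, not JavaFX internals):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicReference;

public class RunLaterSketch {
    // Stand-in for the FX Application Thread: a single-threaded executor
    // that "owns" the canvas, just as JavaFX's application thread does.
    static final ExecutorService fxThread = Executors.newSingleThreadExecutor(r -> {
        Thread t = new Thread(r, "JavaFX Application Thread");
        t.setDaemon(true);
        return t;
    });

    // Stand-in for Platform.runLater(): enqueue the draw call instead of
    // executing it on the calling (worker) thread.
    static java.util.concurrent.Future<?> runLater(Runnable draw) {
        return fxThread.submit(draw);
    }

    // Posts a "draw" and reports which thread actually executed it.
    static String drawAndReportThread() throws Exception {
        AtomicReference<String> drawnOn = new AtomicReference<>();
        // The worker never touches the canvas itself; it only posts the work.
        runLater(() -> drawnOn.set(Thread.currentThread().getName())).get();
        return drawnOn.get();
    }
}
```

In real code you would replace the executor with Platform.runLater() itself and put your GraphicsContext calls inside the Runnable.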

Related

Is there a maximum value for the volume depth of a render texture?

I'm using a render texture as an array of textures by specifying the size of the array in the volume depth property. Sometimes, when I exceed a certain value (e.g. 45 for 128x128 textures), it returns an error: D3D11: Failed to create RenderTexture (128 x 128 fmt 39 aa 1), error 0x80070057, which isn't very clear. I therefore suppose this property has a maximum value, but I could not find it in the Unity manual or anywhere on the internet.
Does anyone know this value, or where I could find it?
The width, height, and depth must be equal to or less than D3D11_REQ_TEXTURE3D_U_V_OR_W_DIMENSION (2048).
Likely you are having issues with some other parameter. Try enabling the Direct3D Debug Device for better information. Use -force-d3d11-debug. With Windows 10 or Windows 11, you have to install it by enabling the Windows optional feature Graphics Tools.
See Microsoft Docs.

MRTK - Maximum number of 64 colliders found in PokePointer overlap query

I am trying to build a color-selection list in my personal project, with 48 PressableButtonHoloLens2 prefabs + a GridObjectCollection. When I run and hover with the simulated fingertip, the editor gives me the warning messages below.
Q1: Is this because too many buttons are too close to each other? Or just the number of the buttons with collider is over 64? The message says 'Consider increasing the query buffer size in the pointer profile'
Q2: Where can I increase the buffer size? I don't see any 'Buffer size' field in the pointer profile.
Q3: Would it decrease performance? (increasing the buffer size)
Warning message
Maximum number of 64 colliders found in PokePointer overlap query.
Consider increasing the query buffer size in the pointer profile.
UnityEngine.Debug:LogWarning(Object)
Microsoft.MixedReality.Toolkit.Input.PokePointer:FindClosestTouchableForLayerMask(LayerMask,
BaseNearInteractionTouchable&, Single&, Vector3&) (at
Assets/MixedRealityToolkit.SDK/Features/UX/Scripts/Pointers/PokePointer.cs:169)
Microsoft.MixedReality.Toolkit.Input.PokePointer:OnPreSceneQuery() (at
Assets/MixedRealityToolkit.SDK/Features/UX/Scripts/Pointers/PokePointer.cs:127)
Microsoft.MixedReality.Toolkit.Input.FocusProvider:UpdatePointer(PointerData)
(at
Assets/MixedRealityToolkit.Services/InputSystem/FocusProvider.cs:878)
Microsoft.MixedReality.Toolkit.Input.FocusProvider:UpdatePointers()
(at
Assets/MixedRealityToolkit.Services/InputSystem/FocusProvider.cs:841)
Microsoft.MixedReality.Toolkit.Input.FocusProvider:Update() (at
Assets/MixedRealityToolkit.Services/InputSystem/FocusProvider.cs:518)
Microsoft.MixedReality.Toolkit.<>c:b__60_0(IMixedRealityService)
(at Assets/MixedRealityToolkit/Services/MixedRealityToolkit.cs:880)
Microsoft.MixedReality.Toolkit.MixedRealityToolkit:ExecuteOnAllServices(IEnumerable1,
Action1) (at
Assets/MixedRealityToolkit/Services/MixedRealityToolkit.cs:969)
Microsoft.MixedReality.Toolkit.MixedRealityToolkit:ExecuteOnAllServicesInOrder(Action`1)
(at Assets/MixedRealityToolkit/Services/MixedRealityToolkit.cs:950)
Microsoft.MixedReality.Toolkit.MixedRealityToolkit:UpdateAllServices()
(at Assets/MixedRealityToolkit/Services/MixedRealityToolkit.cs:880)
Microsoft.MixedReality.Toolkit.MixedRealityToolkit:Update() (at
Assets/MixedRealityToolkit/Services/MixedRealityToolkit.cs:580)
To reproduce
Create an empty game object
Put 48 x PressableButtonHoloLens2 prefabs under it
Assign GridObjectCollection to the parent
Update layout (cell width x height = 0.032)
Run and hover with simulated hand.
Expected behavior
No warning messages
Your Setup (please complete the following information)
Unity Version [e.g. 2018.4.6f1]
MRTK Version [e.g. v2.0.0]
https://github.com/microsoft/MixedRealityToolkit-Unity/issues/6052
Q1: Is this because too many buttons are too close to each other? Or just the number of the buttons with collider is over 64? The message says 'Consider increasing the query buffer size in the pointer profile'
It is because there are too many buttons close to each other.
Q2: Where can I increase the buffer size? I don't see any 'Buffer size' field in the pointer profile.
You can do this on the PokePointer prefab: in the PokePointer script, look for the "Scene Query Buffer Size" field.
Q3: Would it decrease performance? (increasing the buffer size)
Yes, I anticipate it would, though it is unclear how much relative to other components in the scene. Note that the poke pointer does run queries every frame, at least one per hand.

Can Flutter render images from raw pixel data? [duplicate]

Setup
I am using a custom RenderBox to draw.
The canvas object in the code below comes from the PaintingContext in the paint method.
Drawing
I am trying to render pixels individually by using Canvas.drawRect.
I should point out that these are sometimes larger and sometimes smaller than the pixels on screen they actually occupy.
for (int i = 0; i < width * height; i++) {
  // in this case, the rect size is 1
  canvas.drawRect(
      Rect.fromLTWH((i % width).toDouble(),
          (i ~/ width).toDouble(), 1, 1),
      Paint()..color = colors[i]);
}
Storage
I am storing the pixels as a List<List<Color>> (colors in the code above). I tried differently nested lists previously, but they did not cause any noticeable differences in performance.
The memory on my Android Emulator test device increases by 282.7MB when populating the list with a 999x999 image. Note that it only temporarily increases by 282.7MB. After about half a minute, the increase drops to 153.6MB and stays there (without any user interaction).
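For scale, a back-of-the-envelope calculation (my own arithmetic, shown as a Java sketch purely because it is just arithmetic): the raw 32-bit pixel data for a 999x999 image is only about 4 MB, so the reported 282.7 MB suggests that the per-object overhead of boxing each pixel as a Color inside nested lists dominates, by a factor of roughly 70:

```java
public class PixelMemorySketch {
    // Back-of-the-envelope only; the 282.7 MB figure is the number reported above.
    static long rawPixelBytes(int width, int height) {
        return (long) width * height * 4; // 4 bytes per ARGB pixel
    }

    public static void main(String[] args) {
        long raw = rawPixelBytes(999, 999);      // 3,992,004 bytes (~3.8 MiB)
        double observed = 282.7 * 1024 * 1024;   // reported heap growth
        System.out.printf("raw = %d bytes, overhead factor = ~%.0f%n",
                raw, observed / raw);
    }
}
```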
Rendering
With a resolution of 999x999, the code above causes a GPU max of 250.1 ms/frame and a UI max of 1835.9 ms/frame, which is obviously unacceptable. The UI freezes for two seconds when trying to draw a 999x999 image, which should be a piece of cake (I would guess) considering that 4k video runs smoothly on the same device.
CPU
I am not exactly sure how to track this properly using the Android profiler, but while populating or changing the list, i.e. drawing the pixels (which is the case for the above metrics as well), CPU usage goes from 0% to up to 60%. Here are the AVD performance settings:
Cause
I have no idea where to start since I am not even sure what part of my code causes the freezing. Is it the memory usage? Or the drawing itself?
How would I go about this in general? What am I doing wrong? How should I store these pixels instead?
Efforts
I have tried so much that did not help at all that I will try to only point out the most notable ones:
I tried converting the List<List<Color>> to an Image from the dart:ui library, hoping to use Canvas.drawImage. In order to do that, I tried encoding my own PNG, but I have not been able to render more than a single row. However, it did not look like that would boost performance. When trying to convert a 9999x9999 image, I ran into an out-of-memory exception. Now, I am wondering how video is rendered at all, as a few seconds of any 4k video will easily take up more memory than a 9999x9999 image.
I tried implementing the image package. However, I stopped before completing it as I noticed that it is not meant to be used in Flutter but rather in HTML. I would not have gained anything using that.
This one is pretty important for the following conclusion: I tried to just draw without storing the pixels, i.e. using Random.nextInt to generate random colors. When trying to randomly generate a 999x999 image, this resulted in a GPU max of 1824.7 ms/frame and a UI max of 2362.7 ms/frame, which is even worse, especially in the GPU department.
Conclusion
This is the conclusion I reached before trying my failed attempt at rendering using Canvas.drawImage: Canvas.drawRect is not made for this task as it cannot even draw simple images.
How do you do this in Flutter?
Notes
This is basically what I tried to ask over two months ago (yes, I have been trying to resolve this issue for that long), but I think that I did not express myself properly back then and that I knew even less what the actual problem was.
The highest resolution I can properly render is around 10k pixels. I need at least 1m.
I am thinking that abandoning Flutter and going native might be my only option. However, I would like to believe that I am just approaching this problem completely wrong. I have spent about three months trying to figure this out, and I did not find anything that led me anywhere.
Solution
dart:ui has a function that converts pixels to an Image easily: decodeImageFromPixels
Example implementation
Issue on performance
Does not work in the current master channel
I was simply not aware of this back when I created this answer, which is why I wrote the "Alternative" section.
Alternative
Thanks to #pslink for reminding me of BMP after I wrote that I had failed to encode my own PNG.
I had looked into it previously, but I thought that it looked too complicated without sufficient documentation. Then I found this nice article explaining the necessary BMP headers and implemented 32-bit BGRA (ARGB, but BGRA is the order of the default mask) by copying Example 2 from the "BMP file format" Wikipedia article. I went through all the sources but could not find an original source for this example; maybe the authors of the Wikipedia article wrote it themselves.
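To make the header layout concrete, here is a sketch of the simplest 32-bit variant (shown in Java for neutrality): a 54-byte header, i.e. a 14-byte file header plus a 40-byte BITMAPINFOHEADER with BI_RGB compression, rather than the longer BITMAPV4HEADER with explicit BGRA channel masks that the Wikipedia Example 2 uses. The field values and names are my own choices:

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class BmpHeaderSketch {
    // Builds a minimal 54-byte BMP header for a 32-bit image.
    // All multi-byte fields are little-endian, per the BMP format.
    static byte[] bmpHeader(int width, int height) {
        int dataSize = width * height * 4;
        ByteBuffer b = ByteBuffer.allocate(54).order(ByteOrder.LITTLE_ENDIAN);
        b.put((byte) 'B').put((byte) 'M');
        b.putInt(54 + dataSize);   // total file size
        b.putInt(0);               // reserved
        b.putInt(54);              // offset to pixel data
        b.putInt(40);              // BITMAPINFOHEADER size
        b.putInt(width);
        b.putInt(height);          // positive height = bottom-up row order
        b.putShort((short) 1);     // color planes
        b.putShort((short) 32);    // bits per pixel
        b.putInt(0);               // compression: BI_RGB (none)
        b.putInt(dataSize);        // size of the raw pixel data
        b.putInt(2835);            // horizontal resolution, px/m (~72 DPI)
        b.putInt(2835);            // vertical resolution
        b.putInt(0);               // palette colors (none)
        b.putInt(0);               // important colors (all)
        return b.array();
    }
}
```

Note that with BI_RGB the fourth byte per pixel is typically ignored by decoders; the Wikipedia example uses BI_BITFIELDS precisely to get a real alpha channel, which is why my actual implementation followed it.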
Results
Using Canvas.drawImage and my 999x999 pixels converted to an image from a BMP byte list, I get a GPU max of 9.9 ms/frame and a UI max of 7.1 ms/frame, which is awesome!
| ms/frame | Before (Canvas.drawRect) | After (Canvas.drawImage) |
|-----------|---------------------------|--------------------------|
| GPU max | 1824.7 | 9.9 |
| UI max | 2362.7 | 7.1 |
Conclusion
Canvas operations like Canvas.drawRect are not meant to be used like that.
Instructions
First off, this is quite straightforward. However, you need to populate the byte list correctly; otherwise, you will get an error that your data is not correctly formatted and see no results, which can be quite frustrating.
You will need to prepare your image before drawing as you cannot use async operations in the paint call.
In code, you need to use a Codec to transform your list of bytes into an image.
final list = [
  0x42, 0x4d, // 'B', 'M'
  ...];
// make sure that you either know the file size, data size, and data offset beforehand
// or that you edit these bytes afterwards
final Uint8List bytes = Uint8List.fromList(list);
final Codec codec = await instantiateImageCodec(bytes);
final Image image = (await codec.getNextFrame()).image;
You need to pass this image to your drawing widget, e.g. using a FutureBuilder.
Now, you can just use Canvas.drawImage in your draw call.

Caffe bvlc_googlenet minimum accepted dimensions

What is the minimum image input size accepted by bvlc_googlenet model implemented by Caffe?
I'm using 50x50 images with crop_size = 36, and I get the following error when running the solver:
caffe::Blob<>::Reshape() - Floating point exception
I have to resize my images to 256x256 (the default input size of the bvlc_googlenet model) with crop_size = 224 to avoid the error.
Does this model only accept its default sizes, or do I have to hack around a bit to make other sizes work?
Thanks!!
After several hours of trying to fix the problem, I figured out why I was facing it.
GoogLeNet accepts 224x224 images as input by default. Because the network is so deep, a 50x50 image (36x36 after cropping) shrinks, after a series of convolution and pooling layers, into a very small output; at some point the feature map becomes smaller than the kernel size of the next layer. This causes a Reshape exception like the one I faced here.
Solution:
Although it is not preferred to edit the kernel_size param of the layer causing the exception (to keep working to the network's specification), this will fix the problem: choose a smaller kernel size and test the results until it works.
Alternatively, follow the default GoogLeNet specification by resizing your input images to 256x256 (keeping crop_size at 224), or resize them directly to 224x224 and remove the crop_size param.
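The shrinking can be checked with a quick output-size calculation. The sketch below traces only the stride-2 layers of bvlc_googlenet (inception modules preserve spatial size); the ceil/floor conventions follow Caffe's conv/pool arithmetic, but treat the exact trace as approximate:

```java
public class GoogLeNetSizeSketch {
    // Caffe computes conv output with floor division and pool output with ceil.
    static int conv(int in, int kernel, int stride, int pad) {
        return (in + 2 * pad - kernel) / stride + 1;                 // floor
    }
    static int pool(int in, int kernel, int stride) {
        return (int) Math.ceil((in - kernel) / (double) stride) + 1; // ceil
    }

    // Spatial size entering the final 7x7 average pool of bvlc_googlenet.
    static int finalFeatureMap(int input) {
        int s = conv(input, 7, 2, 3); // conv1/7x7_s2
        s = pool(s, 3, 2);            // pool1/3x3_s2
        s = conv(s, 3, 1, 1);         // conv2/3x3
        s = pool(s, 3, 2);            // pool2/3x3_s2
        s = pool(s, 3, 2);            // pool3/3x3_s2
        s = pool(s, 3, 2);            // pool4/3x3_s2
        return s;
    }

    public static void main(String[] args) {
        System.out.println(finalFeatureMap(224)); // 7: fits the 7x7 average pool
        System.out.println(finalFeatureMap(36));  // 1: smaller than the 7x7 kernel
    }
}
```

A 224 input reaches the average pool at exactly 7x7, while a 36 crop arrives smaller than the pooling kernel, which is where the Reshape fails.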

Objects only rendered in Left orthographic view

How the viewports look after loading the file:
I am having an issue caused by extremely large objects (in terms of physical size, not polycount etc.) that I imported from a game using NinjaRipper (a script used for extracting 3D models from games). When I open the file containing these large objects, the objects are only rendered in the left orthographic viewport. All other viewports/views do not show the geometry, regardless of which rendering mode (wireframe, edges faces, etc.) I have selected. The objects are also not visible in perspective views.

When I unhide all items apart from a single object (of normal size), I am able to see the object in all viewports, including perspective viewports. When I unhide all again, the object which could previously be seen disappears. When switching to a perspective view while these extremely large objects are present, the ViewCube disappears for an unknown reason; zooming in or out in a perspective viewport also makes it disappear.

This is the only scene I've had so far which shows these issues, and all my graphics drivers are up to date (specs listed below). The scene contains 3602 objects with 1,957,286 polygons and 1,508,550 vertices.
This is the furthest I could zoom out in 3ds max:
Viewcube has disappeared on top right and bottom right viewport:
I tried removing all of the extremely large objects by hand, after which the remaining (normal sizes) objects could be seen in 2 of the viewports (top left and top right viewport did render correctly).
Viewports after having deleted all extremely large objects:
I tried resetting the scene, after which I merged the scene containing all 'normal sized objects' into an empty scene. This resulted in all viewports rendering the objects correctly. However, after saving the file and re-opening the saved file, 2 of the 4 viewports did not render the objects as was the case after just having deleted all but the 'normal sized' objects.
My question is: how should I deal with these extremely large imported objects in order to fix the viewport rendering issues they cause?
I wrote a simple bit of MAXScript code to print out the maximum size of the biggest object in the scene, which resulted in a value of 2.6*10^38 [generic units]; according to my calculation, this corresponds to 6.6*10^36 [meters]. In summary: extremely large. (I suspect the NinjaRipper script, or the script which imports its output into 3ds Max, had some sort of error causing some vertices to have extremely large position values.) When I switch to the Measure tab in Utilities and press Ctrl+A to select all objects in the scene (including the extremely large objects), 3ds Max crashes due to the large object size (error message: "Application error - An error has occured and the application will now close. No Scene changes have occured since your last save.").

I could write some MAXScript code which deletes all objects larger than a certain size (for example 10^5 [meters]). However, as mentioned above, this for some reason does not fix the issue completely: after saving the scene with only 'normal sized' objects and re-opening it, only 2 of the 4 viewports render the objects correctly. I ran the code for measuring the maximum size of the largest object again after deleting all extremely large objects, to check that I had indeed not skipped one of them; the result was 121.28 [generic units] (corresponding to object "Mesh_3598"), which is a relatively normal size. Still, 2 of my 4 viewports do not render the objects even after deleting the large ones (they can only be seen when the left orthographic view is selected in the viewports that fail to render part of the time).
Code for checking largest object (also prints out maximum size of this object):
global_max = 0
largest_obj = undefined
for obj in geometry do (
    obj_max_x = (obj.max.x - obj.min.x)
    obj_max_y = (obj.max.y - obj.min.y)
    obj_max_z = (obj.max.z - obj.min.z)
    local_max = amax #(obj_max_x, obj_max_y, obj_max_z)
    -- parentheses are required here: without them, largest_obj = obj
    -- would run on every iteration, not only when a new maximum is found
    if local_max > global_max do (
        global_max = local_max
        largest_obj = obj
    )
)
messagebox ("global max = " + global_max as string)
messagebox ("largest obj = " + largest_obj as string)
See the following links for the 3ds max scene files I have mentioned:
https://drive.google.com/open?id=1bAilmaHAXDr4WuD8gGS4piQfPzzJM9MH
Any suggestions/help will be greatly appreciated. Thank you very much!
System specs:
-Autodesk 3ds max 2018 x64
-Windows 10 PRO x64
-i5 6600k @3.5GHz
-msi z170a gaming m7 - socket 1151 - atx
-coolermaster g750m -750watt
-msi radeon r9-390x gaming -8gb
-noctua NH-D15
-kingston hyper-x fury black 16gb-pc-21300-dimm-4x4gb#2666mhz
As it turns out, the extremely large objects were indeed causing the viewport rendering errors. After removing all objects with a maximum size over 100000 [generic units], the rendering errors were gone. I suspect the issue was caused by the objects not fitting between the viewport's near and far planes due to their extreme size.