Unity3D WebGL headless not rendering

I have posted the same question on Unity's forum, but there hasn't been any answer, so I'm posting it here too.
I have been trying to run a Unity WebGL build in headless mode (through Puppeteer) while saving 'screenshots' of the game, but the camera rendering doesn't seem to be working: the resulting images are all black.
It works as expected when not in headless mode (but still WebGL).
It also works properly in standalone builds (e.g., Windows, macOS) through -batchmode.
Here's the code in question:
// Problem seems to be in the following 2 lines
RenderTexture.active = camera.targetTexture;
camera.Render();
// Same dimensions, both in headless and non-headless mode
Debug.Log("CAMERA TARGET TEXTURE WIDTH: " + camera.targetTexture.width);
Debug.Log("CAMERA TARGET TEXTURE HEIGHT: " + camera.targetTexture.height);
// Read the active render texture back into a CPU-side Texture2D
tempTexture2D = new Texture2D(camera.targetTexture.width, camera.targetTexture.height, TextureFormat.RGB24, false);
tempTexture2D.ReadPixels(new Rect(0, 0, camera.targetTexture.width, camera.targetTexture.height), 0, 0);
tempTexture2D.Apply();
// RGBA(0.000, 0.000, 0.000, 1.000): totally black in WebGL headless mode; works fine otherwise
Debug.Log(tempTexture2D.GetPixels(100, 100, 1, 1)[0].ToString());
// Encode texture into JPG
byte[] bytes = tempTexture2D.EncodeToJPG();
// Byte count is almost half of the headful one when in headless mode
Debug.Log("IMG " + frameNumber + " byte count: " + bytes.Length);
// Save to persistentDataPath (IndexedDB in WebGL);
// that data is then read on the client side and encoded again
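For completeness, a minimal sketch of the save step mentioned in the comments above, assuming frameNumber and bytes from the snippet; on the WebGL target, Application.persistentDataPath is backed by IndexedDB (IDBFS):
using System.IO;
using UnityEngine;

public class FrameSaver : MonoBehaviour
{
    // Writes one encoded frame under persistentDataPath.
    // On WebGL this path lives in the IndexedDB-backed virtual file system.
    public void SaveFrame(int frameNumber, byte[] bytes)
    {
        string path = Path.Combine(Application.persistentDataPath, "frame_" + frameNumber + ".jpg");
        File.WriteAllBytes(path, bytes);
    }
}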
I found some differences between the WebGL headful and headless versions (respective pictures below).
I have also tried setting --use-gl=swiftshader, which yields better GPU stats but still renders everything black.
To be clear, the arguments I'm passing to Chromium are the following:
args: [
  '--headless',
  '--hide-scrollbars',
  '--mute-audio',
  '--no-sandbox',
  '--use-gl=swiftshader' // tested with and without
]
The headless log output coming from Unity is the following:
PAGE LOG: Loading player data from data.unity3d
PAGE LOG: Initialize engine version: 2018.4.10f1 (a0470569e97b)
PAGE LOG: Creating WebGL 2.0 context.
PAGE LOG: Renderer: WebKit WebGL
PAGE LOG: Vendor: WebKit
PAGE LOG: Version: OpenGL ES 3.0 (WebGL 2.0 (OpenGL ES 3.0 Chromium))
PAGE LOG: GLES: 3
PAGE LOG: EXT_color_buffer_float GL_EXT_color_buffer_float EXT_float_blend GL_EXT_float_blend EXT_texture_filter_anisotropic GL_EXT_texture_filter_anisotropic OES_texture_float_linear GL_OES_texture_float_linear WEBGL_compressed_texture_etc GL_WEBGL_compressed_texture_etc WEBGL_compressed_texture_etc1 GL_WEBGL_compressed_texture_etc1 WEBGL_compressed_texture_s3tc GL_WEBGL_compressed_texture_s3tc WEBGL_debug_renderer_info GL_WEBGL_debug_renderer_info WEBGL_debug_shaders GL_WEBGL_debug_shaders WEBGL_lose_context GL_WEBGL_lose_context
PAGE LOG: OPENGL LOG: Creating OpenGL ES 3.0 graphics device ; Context level <OpenGL ES 3.0> ; Context handle 1
PAGE LOG: UnloadTime: 0.340000 ms
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_OPERATION : glFramebufferTexture2D: <- error from previous GL command
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_OPERATION : GetShaderiv: <- error from previous GL command
PAGE LOG: WebGL: INVALID_OPERATION: renderbufferStorageMultisample: samples out of range
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glClear: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawElements: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawArrays: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glBlitFramebufferCHROMIUM: framebuffer incomplete
Could the problem be WebGL hardware acceleration altogether? Can I disable it from my WebGL build?
This issue seems to be related to: Rendering WebGL image in headless chrome without a GPU
But that seems to be working properly, at least under macOS:
https://github.com/Apidcloud/WebGLHeadlessRendering
So I'm still assuming it has something to do with Unity3D. Even when headful, rendering turns black when using SwiftShader, with the only GPU stats difference being that video-decode hardware acceleration is disabled.
Thanks!
Edit with answer (more details in the actual answer below):
It finally works after disabling anti-aliasing in Unity, which otherwise seems to throw some OpenGL errors. It works with and without SwiftShader.

After 2 or 3 days of struggling with this and trying to pin down the exact problem, it finally works after disabling anti-aliasing in Unity, which otherwise seems to throw some OpenGL errors.
That said, Unity3D WebGL rendering works headlessly (through Chromium and Puppeteer) even without SwiftShader, though SwiftShader is probably what you want for no-GPU scenarios (e.g., a server).
The log output for both scenarios (with and without SwiftShader), though still weird, is the following (note that the framebuffer-incomplete errors are gone):
PAGE LOG: [.WebGL-0x7f842a0f6400]GL ERROR :GL_INVALID_OPERATION : glFramebufferTexture2D: <- error from previous GL command
PAGE LOG: [.WebGL-0x7f842a0f6400]GL ERROR :GL_INVALID_OPERATION : GetShaderiv: <- error from previous GL command
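For reference, a minimal sketch of what disabling the anti-aliasing can look like in code (it can equally be done in the Quality settings UI); captureCamera, width, and height are placeholder names:
using UnityEngine;

public class DisableAntiAliasing : MonoBehaviour
{
    public Camera captureCamera; // hypothetical capture camera
    public int width = 1280;
    public int height = 720;

    void Awake()
    {
        // Turn off project-wide MSAA (0 = disabled; 2/4/8 are MSAA sample counts).
        QualitySettings.antiAliasing = 0;

        // If the capture camera renders into its own RenderTexture, make sure that
        // texture is not multisampled either (antiAliasing = 1 means no MSAA).
        var rt = new RenderTexture(width, height, 24) { antiAliasing = 1 };
        captureCamera.targetTexture = rt;
    }
}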
Thanks!

Related

Unity Engine: [Cam Post Process] + [outline Post Process] Color and depth buffers have mismatched Surfaces Resolved Buffer flag Error Log?

When I disable the camera post-process, the error goes away (using the standard built-in post-processing plus a custom outline post-process).

Shader graph error - 'PBRDeferredFragment': cannot convert from 'struct v2f_surf' to 'struct SurfaceDescription'

I started using Shader Graph and created one shader. Then I tried building my game; however, I get this error:
Shader error in 'Shader Graphs/blend': 'PBRDeferredFragment': cannot convert from 'struct v2f_surf' to 'struct SurfaceDescription' at /unfinishedProjects/handsignes/handsignes/Library/PackageCache/com.unity.shadergraph#12.1.8/Editor/Generation/Targets/BuiltIn/Editor/ShaderGraph/Includes/PBRDeferredPass.hlsl(145) (on d3d11)
Compiling Subshader: 0, Pass: Pass 2, Vertex program with LIGHTPROBE_SH SHADOWS_SHADOWMASK
Platform defines: SHADER_API_DESKTOP UNITY_ENABLE_DETAIL_NORMALMAP UNITY_ENABLE_REFLECTION_BUFFERS UNITY_LIGHTMAP_FULL_HDR UNITY_LIGHT_PROBE_PROXY_VOLUME UNITY_PASS_DEFERRED UNITY_PBS_USE_BRDF1 UNITY_SPECCUBE_BLENDING UNITY_SPECCUBE_BOX_PROJECTION UNITY_USE_DITHER_MASK_FOR_ALPHABLENDED_SHADOWS
Disabled keywords: DIRLIGHTMAP_COMBINED DYNAMICLIGHTMAP_ON INSTANCING_ON LIGHTMAP_ON LIGHTMAP_SHADOW_MIXING SHADER_API_GLES30 UNITY_ASTC_NORMALMAP_ENCODING UNITY_COLORSPACE_GAMMA UNITY_ENABLE_NATIVE_SHADOW_LOOKUPS UNITY_FRAMEBUFFER_FETCH_AVAILABLE UNITY_HALF_PRECISION_FRAGMENT_SHADER_REGISTERS UNITY_HARDWARE_TIER1 UNITY_HARDWARE_TIER2 UNITY_HARDWARE_TIER3 UNITY_HDR_ON UNITY_LIGHTMAP_DLDR_ENCODING UNITY_LIGHTMAP_RGBM_ENCODING UNITY_METAL_SHADOWS_USE_POINT_FILTERING UNITY_NO_DXT5nm UNITY_NO_FULL_STANDARD_SHADER UNITY_NO_SCREENSPACE_SHADOWS UNITY_PBS_USE_BRDF2 UNITY_PBS_USE_BRDF3 UNITY_PRETRANSFORM_TO_DISPLAY_ORIENTATION UNITY_UNIFIED_SHADER_PRECISION_MODEL UNITY_VIRTUAL_TEXTURING _GBUFFER_NORMALS_OCT _MAIN_LIGHT_SHADOWS _MAIN_LIGHT_SHADOWS_CASCADE _MAIN_LIGHT_SHADOWS_SCREEN _MIXED_LIGHTING_SUBTRACTIVE _SHADOWS_SOFT
I tried creating a new project, copying the Shader Graph file there, and building a scene with just a cube that uses a material with that shader. That project built without any errors.
That's why I think the problem is probably some project setting. I compared every line of the project settings with the other project, but they were identical.
I don't know what is causing this problem. If you need any screenshots of settings, ask and I will send them.
Found the reason for this behaviour: Shader Graph doesn't support the built-in renderer. I had to switch to URP instead.

Converting Bitmap to Texture2D crashes unity

I have tried many ways to do this. I want to show a Bitmap in a Unity UI image, and I have tried every approach I could find. I can do it by saving the file to disk and then reading it back, but that is too costly, as it needs to be done every frame.
I checked that the image is loaded correctly; the problem is in converting it to a Texture2D. Saving it to a memory stream and reading it back also crashes Unity. I tried Unity 2017, 2018, and 2019; all crash.
(I only need to make this work in a desktop build, if that helps.)
// Load the bitmap with System.Drawing and wrap it as an AForge UnmanagedImage
Bitmap bitmap = new Bitmap(@"C:\image1.jpg");
final = UnmanagedImage.FromManagedImage(bitmap);
// Convert the UnmanagedImage to a texture via its native image pointer
Texture2D convertedTx = Texture2D.CreateExternalTexture(final.Width, final.Height, TextureFormat.ARGB32, false, false, final.ImageData);
convertedTx.UpdateExternalTexture(final.ImageData);
display.texture = convertedTx;
I think you didn't put this code in a coroutine, so it runs on the main thread and causes Unity to crash.
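For reference, a minimal sketch of the in-memory route (no disk I/O) that is commonly used for this; treat it as a baseline to test in isolation, since the asker reports that one MemoryStream variant also crashed. The helper name and usage are hypothetical:
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using UnityEngine;

public static class BitmapTextureUtil
{
    // Encodes the Bitmap to PNG in memory and lets Unity decode it.
    public static Texture2D ToTexture2D(Bitmap bitmap)
    {
        using (var ms = new MemoryStream())
        {
            bitmap.Save(ms, ImageFormat.Png); // encode in memory, no temp file
            var tex = new Texture2D(2, 2);    // size/format are replaced by LoadImage
            tex.LoadImage(ms.ToArray());      // decodes the PNG into the texture
            return tex;
        }
    }
}
Note that this must run on Unity's main thread, since Texture2D creation and LoadImage are main-thread-only.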

JavaFX 8 application requires system reboot before working

I created a JavaFX 1.8.0_60 app on Windows 7 64-bit with MSI/EXE native installers. The app installs correctly (MSI at system level, EXE at user level), but when I click the desktop shortcut, my application window is sometimes blank and I need to reboot to make it work.
Running the native distributions with debug settings indicates some missing Prism info/loading:
jvmArgs = ['-verbose:class']
systemProperties = ['javafx.verbose':'true', 'prism.verbose':'true']
Before reboot - window is blank
(System.out)
Prism pipeline init order: d3d sw
Using native-based Pisces rasterizer
Using dirty region optimizations
Not using texture mask for primitives
Not forcing power of 2 sizes for textures
Using hardware CLAMP_TO_ZERO mode
Opting in for HiDPI pixel scaling
Prism pipeline name = com.sun.prism.d3d.D3DPipeline
Loading D3D native library ...
succeeded.
Direct3D initialization succeeded
(X) Got class = class com.sun.prism.d3d.D3DPipeline
Initialized prism pipeline: com.sun.prism.d3d.D3DPipeline
(System.err)
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\msvcr120.dll from relative path
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\msvcp120.dll from relative path
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\prism_d3d.dll from relative path
D3DPipelineManager: Created D3D9Ex device
JavaFX: using com.sun.javafx.tk.quantum.QuantumToolkit
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\glass.dll from relative path
vsync: true vpipe: true
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\javafx_font.dll from relative path
After reboot - window is correct
(System.out)
Prism pipeline init order: d3d sw
Using native-based Pisces rasterizer
Using dirty region optimizations
Not using texture mask for primitives
Not forcing power of 2 sizes for textures
Using hardware CLAMP_TO_ZERO mode
Opting in for HiDPI pixel scaling
Prism pipeline name = com.sun.prism.d3d.D3DPipeline
Loading D3D native library ...
succeeded.
Direct3D initialization succeeded
(X) Got class = class com.sun.prism.d3d.D3DPipeline
Initialized prism pipeline: com.sun.prism.d3d.D3DPipeline
OS Information:
Windows 7 build 7601
D3D Driver Information:
Intel(R) HD Graphics
\\.\DISPLAY1
Driver igdumd64.dll, version 8.15.10.2752
Pixel Shader version 3.0
Device : ven_8086, dev_0102, subsys_307017AA
Max Multisamples supported: 4
Loading Prism common native library ...
succeeded.
(System.err)
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\msvcr120.dll from relative path
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\msvcp120.dll from relative path
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\prism_d3d.dll from relative path
D3DPipelineManager: Created D3D9Ex device
JavaFX: using com.sun.javafx.tk.quantum.QuantumToolkit
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\glass.dll from relative path
Maximum supported texture size: 8192
Maximum texture size clamped to 4096
vsync: true vpipe: true
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\javafx_font.dll from relative path
Loaded C:\Users\me\AppData\Local\myapp\runtime\lib\ext\..\..\bin\prism_common.dll from relative path
Any ideas why graphics/Prism won't load, and how to diagnose further on Windows 7 so that I don't need to reboot, would be appreciated.
Thanks,
Paul

OpenGL ES 2.0 Support for Android?

Does the Android emulator support OpenGL ES 2.0? I've seen some people say "Yes, but you have to change a few settings", and I've also seen "No, it doesn't support it, period". Here's what I've done to try to correct the problem, including some error messages I got.
First, I modified the AndroidManifest.xml to contain the following code:
<uses-feature
    android:glEsVersion="0x00020000" />
<uses-sdk
    android:minSdkVersion="15"
    android:targetSdkVersion="17" />
Then, when I want to instantiate my GLSurfaceView, I use this sequence of code:
super(context);
setEGLContextClientVersion(2);
setRenderer(new MyRenderer());
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
Then, every resource I found said that you must go into the AVD Manager, select the emulator, go to "Hardware", add "GPU emulation", and set the boolean to "yes". However, that option doesn't appear when I look at mine.
What's peculiar is that another emulator in my AVD Manager does have the "Hardware" table.
And just to show you exactly what I'm doing, here's some code that does some stuff I want to do in OpenGL ES 2.0 (I mainly got this from Android's own tutorials):
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
program = GLES20.glCreateProgram();
GLES20.glAttachShader(program, vertexShader);
GLES20.glAttachShader(program, fragmentShader);
GLES20.glLinkProgram(program);
I don't want to change my code back to work with OpenGL ES 1.0 because that will require a lot of headaches and if I can avoid it, I will.
Finally, when I try running my program, the program closes with the window: "Unfortunately, has stopped." This is what LogCat told me:
12-05 06:16:27.165: E/AndroidRuntime(936): FATAL EXCEPTION: GLThread 81
12-05 06:16:27.165: E/AndroidRuntime(936): java.lang.IllegalArgumentException: No config chosen
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$BaseConfigChooser.chooseConfig(GLSurfaceView.java:874)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$EglHelper.start(GLSurfaceView.java:1024)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1401)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)
I can say yes to your question: the Android emulator supports OpenGL ES 2.0.
I created an app with cocos2d-x v2 (which uses OpenGL ES 2.0).
I had the same FATAL EXCEPTION: GLThread 81 error with the same stack.
I solved this issue by adding
gLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
before setting the renderer:
gLSurfaceView.setCocos2dxRenderer(new Cocos2dxRenderer());
Now I can run my app on the Android emulator.
See my question and answer at https://stackoverflow.com/a/13719983/307547.
My post at the following link contains a screenshot with the AVD settings:
http://www.cocos2d-x.org/boards/6/topics/12563?r=19274#message-19274
I just fixed the problem without adding any new lines to my source code. In the AVD Manager I set "Use Host GPU" for my emulator device. It now works perfectly fine on my GeForce GTX 570.
The API level on the emulator device is 16, and the min SDK in the manifest is 15.
I got the same problem. Original buggy code:
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.d(this.getClass().getName(), "Into onCreate Draw triangle");
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
            WindowManager.LayoutParams.FLAG_FULLSCREEN);
    requestWindowFeature(Window.FEATURE_NO_TITLE);

    if (detectOpenGLES20()) {
        Log.d("GLES20", "GL ES 2.0 Supported..............!");
    } else {
        Log.d("GLES20", "GL ES 2.0 Not Supported...............!");
    }

    view = new GLSurfaceView(this);
    view.setEGLContextClientVersion(2);
    view.setEGLConfigChooser(true);
    view.setRenderer(new TriangleRenderer(view));
    setContentView(view);
}
Solved by:
a) replacing this line
view.setEGLConfigChooser(true);
with
view.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
b) setting -gpu on via Eclipse --> Run As --> Target --> Additional Emulator Command Line Options
Adding a little more to the discussion above:
There were two different exception messages I came across while working with the above piece of code:
FATAL EXCEPTION: GLThread 75
java.lang.IllegalArgumentException: No configs match configSpec
and
java.lang.IllegalArgumentException: No config chosen
A more detailed case study is narrated at http://on-android-opengl2.blogspot.in/2013/05/android-opengl-es-20-emulator.html