Does the Android emulator support OpenGL ES 2.0? I've seen some people say "Yes, but you have to change a few settings," and others say "No, it doesn't support it, period." Here's what I've done to try to correct the problem, including some of the error messages I got.
First, I modified the AndroidManifest.xml to contain the following code:
<uses-feature
    android:glEsVersion="0x00020000" />
<uses-sdk
    android:minSdkVersion="15"
    android:targetSdkVersion="17" />
Then, when I instantiate my GLSurfaceView, I use the following sequence of calls:
super(context);
setEGLContextClientVersion(2);
setRenderer(new MyRenderer());
setRenderMode(GLSurfaceView.RENDERMODE_WHEN_DIRTY);
Then, every source I found said that you must go into the AVD Manager, select the emulator, go to "Hardware", add "GPU emulation", and set that boolean to "yes". However, here is what I see when I look at mine:
What's peculiar is that I have another emulator in my AVD Manager which does have the "Hardware" table:
And just to show you exactly what I'm doing, here's the OpenGL ES 2.0 code I'm trying to run (mainly taken from Android's own tutorials):
int vertexShader = loadShader(GLES20.GL_VERTEX_SHADER, vertexShaderCode);
int fragmentShader = loadShader(GLES20.GL_FRAGMENT_SHADER, fragmentShaderCode);
program = GLES20.glCreateProgram();
GLES20.glAttachShader(program, vertexShader);
GLES20.glAttachShader(program, fragmentShader);
GLES20.glLinkProgram(program);
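(For reference, loadShader here is the standard helper from the same Android tutorial; a minimal sketch of it:)
public static int loadShader(int type, String shaderCode) {
    // type is GLES20.GL_VERTEX_SHADER or GLES20.GL_FRAGMENT_SHADER
    int shader = GLES20.glCreateShader(type);
    // Upload the GLSL source and compile it
    GLES20.glShaderSource(shader, shaderCode);
    GLES20.glCompileShader(shader);
    return shader;
}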
I don't want to rewrite my code for OpenGL ES 1.0, because that would be a major headache; if I can avoid it, I will.
Finally, when I try running my program, the app crashes with the dialog "Unfortunately, has stopped." This is what LogCat told me:
12-05 06:16:27.165: E/AndroidRuntime(936): FATAL EXCEPTION: GLThread 81
12-05 06:16:27.165: E/AndroidRuntime(936): java.lang.IllegalArgumentException: No config chosen
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$BaseConfigChooser.chooseConfig(GLSurfaceView.java:874)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$EglHelper.start(GLSurfaceView.java:1024)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$GLThread.guardedRun(GLSurfaceView.java:1401)
12-05 06:16:27.165: E/AndroidRuntime(936): at android.opengl.GLSurfaceView$GLThread.run(GLSurfaceView.java:1240)
I can answer yes to your question: the Android emulator does support OpenGL ES 2.0.
I created an app with cocos2d-x v2 (which uses OpenGL ES 2.0) and hit the same FATAL EXCEPTION: GLThread 81 error with the same stack trace. I solved the issue by adding
gLSurfaceView.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
before setting the renderer:
gLSurfaceView.setCocos2dxRenderer(new Cocos2dxRenderer());
Now I can run my app on the Android emulator.
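If you are using a plain GLSurfaceView rather than cocos2d-x, the equivalent fix would look roughly like this (a sketch; MyRenderer stands in for your own GLSurfaceView.Renderer, and the config chooser must be set before the renderer):
GLSurfaceView view = new GLSurfaceView(context);
// Request an OpenGL ES 2.0 context.
view.setEGLContextClientVersion(2);
// Explicitly request RGBA8888 with a 16-bit depth buffer, so the emulator's
// EGL implementation can find a matching config instead of failing with "No config chosen".
view.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
view.setRenderer(new MyRenderer());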
See my question and answer at https://stackoverflow.com/a/13719983/307547.
My post on this link contains screenshot with AVD settings:
http://www.cocos2d-x.org/boards/6/topics/12563?r=19274#message-19274
I just fixed the problem without adding any new lines to my source code. In the AVD Manager I set "Use Host GPU" for my emulator device, and it now works perfectly fine on my GeForce GTX 570.
The API level on the emulator device is 16, and the min SDK in my manifest is 15.
I got the same problem. Original buggy code:
public void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    Log.d(this.getClass().getName(), "Into onCreate Draw triangle");
    getWindow().setFlags(WindowManager.LayoutParams.FLAG_FULLSCREEN,
            WindowManager.LayoutParams.FLAG_FULLSCREEN);
    requestWindowFeature(Window.FEATURE_NO_TITLE);
    if (detectOpenGLES20()) {
        Log.d("GLES20", "GL ES 2.0 Supported..............!");
    } else {
        Log.d("GLES20", "GL ES 2.0 Not Supported...............!");
    }
    view = new GLSurfaceView(this);
    view.setEGLContextClientVersion(2);
    view.setEGLConfigChooser(true);
    view.setRenderer(new TriangleRenderer(view));
    setContentView(view);
}
Solved by:
a.) Replacing this line of code
view.setEGLConfigChooser(true);
with
view.setEGLConfigChooser(8, 8, 8, 8, 16, 0);
b.) Setting -gpu on via Eclipse -> Run As -> Target -> Additional Emulator Command Line Options
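(Outside Eclipse, the same option can be passed directly on the emulator command line; <avd_name> is a placeholder for your AVD's name:)
emulator -avd <avd_name> -gpu on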
Adding a little more to the above discussion: there were two different exception messages I came across while working with the above piece of code:
FATAL EXCEPTION: GLThread 75
java.lang.IllegalArgumentException: No configs match configSpec
and
java.lang.IllegalArgumentException: No config chosen
A more detailed case study is narrated at http://on-android-opengl2.blogspot.in/2013/05/android-opengl-es-20-emulator.html
I have posted the same question on Unity's forum, but there haven't been any answers, so I'm posting it here too.
I have been trying to run a Unity WebGL build in headless mode (through puppeteer) while saving 'screenshots' of the game, but the camera rendering doesn't seem to be working. The resulting images are all black.
It works as expected when not in headless mode (but still WebGL).
It also works properly in standalone builds (e.g., windows, mac), through -batchMode.
Here's the code in question:
// Problem seems to be in the following 2 lines
RenderTexture.active = camera.targetTexture;
camera.Render();
// same dimensions, both in headless and not headless
Debug.Log("CAMERA TARGET TEXTURE WIDTH: " + camera.targetTexture.width);
Debug.Log("CAMERA TARGET TEXTURE HEIGHT: " + camera.targetTexture.height);
tempTexture2D = new Texture2D(camera.targetTexture.width, camera.targetTexture.height, TextureFormat.RGB24, false);
tempTexture2D.ReadPixels(new Rect(0, 0, camera.targetTexture.width, camera.targetTexture.height), 0, 0);
tempTexture2D.Apply();
// RGBA(0.000, 0.000, 0.000, 1.000): totally black, when in WebGL headless mode. Works fine otherwise.
Debug.Log(tempTexture2D.GetPixels(100, 100, 1, 1)[0].ToString());
// Encode texture into JPG
byte[] bytes = tempTexture2D.EncodeToJPG();
// byte count is almost half when in headless mode
Debug.Log("IMG " + frameNumber + " byte count: " + bytes.Length);
// save to persistentData (indexedDB in WebGL)
// that data is then read on client side and encoded again
I found some differences between the WebGL headful and headless versions (respective pictures below).
I have also tried setting --use-gl=swiftshader, which shows better GPU stats, but everything is still rendered black:
To be clear, the arguments I'm passing to Chromium are the following:
args: [
    '--headless',
    '--hide-scrollbars',
    '--mute-audio',
    '--no-sandbox',
    '--use-gl=swiftshader' // tested with and without
]
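For context, this is roughly how those arguments are passed when launching Chromium through puppeteer (a sketch; headless: true in puppeteer.launch is equivalent to the '--headless' argument above):
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch({
    headless: true, // equivalent to passing '--headless'
    args: ['--hide-scrollbars', '--mute-audio', '--no-sandbox', '--use-gl=swiftshader'],
  });
  const page = await browser.newPage();
  // ... load the WebGL build here and capture the saved screenshots ...
  await browser.close();
})();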
The headless log output coming from Unity is the following:
PAGE LOG: Loading player data from data.unity3d
PAGE LOG: Initialize engine version: 2018.4.10f1 (a0470569e97b)
PAGE LOG: Creating WebGL 2.0 context.
PAGE LOG: Renderer: WebKit WebGL
PAGE LOG: Vendor: WebKit
PAGE LOG: Version: OpenGL ES 3.0 (WebGL 2.0 (OpenGL ES 3.0 Chromium))
PAGE LOG: GLES: 3
PAGE LOG: EXT_color_buffer_float GL_EXT_color_buffer_float EXT_float_blend GL_EXT_float_blend EXT_texture_filter_anisotropic GL_EXT_texture_filter_anisotropic OES_texture_float_linear GL_OES_texture_float_linear WEBGL_compressed_texture_etc GL_WEBGL_compressed_texture_etc WEBGL_compressed_texture_etc1 GL_WEBGL_compressed_texture_etc1 WEBGL_compressed_texture_s3tc GL_WEBGL_compressed_texture_s3tc WEBGL_debug_renderer_info GL_WEBGL_debug_renderer_info WEBGL_debug_shaders GL_WEBGL_debug_shaders WEBGL_lose_context GL_WEBGL_lose_context
PAGE LOG: OPENGL LOG: Creating OpenGL ES 3.0 graphics device ; Context level <OpenGL ES 3.0> ; Context handle 1
PAGE LOG: UnloadTime: 0.340000 ms
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_OPERATION : glFramebufferTexture2D: <- error from previous GL command
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_OPERATION : GetShaderiv: <- error from previous GL command
PAGE LOG: WebGL: INVALID_OPERATION: renderbufferStorageMultisample: samples out of range
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glClear: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawElements: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glDrawArrays: framebuffer incomplete
PAGE LOG: [.WebGL-0x7fcdb69e1600]GL ERROR :GL_INVALID_FRAMEBUFFER_OPERATION : glBlitFramebufferCHROMIUM: framebuffer incomplete
Could the problem be the WebGL hardware acceleration altogether? Can I disable it from my WebGL build?
This issue seems to be related to: Rendering WebGL image in headless chrome without a GPU
But that approach seems to work properly, at least under macOS:
https://github.com/Apidcloud/WebGLHeadlessRendering
So I'm still assuming it has something to do with Unity3D. Even in headful mode, everything turns black when using SwiftShader, with the only GPU-stats difference being that video decode hardware acceleration is disabled:
Thanks!
Edit with the answer (more details in the actual answer below):
It finally works after disabling anti-aliasing in Unity, which seems to have been triggering some of the OpenGL errors. It works with and without SwiftShader.
After 2 or 3 days struggling with this and trying to find the exact problem, it finally works after disabling anti-aliasing in Unity, which seems to have been triggering some of the OpenGL errors.
That said, Unity3D WebGL rendering does work headlessly (through Chromium and puppeteer) even without SwiftShader, though SwiftShader is probably what you want for no-GPU scenarios (e.g., some server).
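For reference, anti-aliasing can be disabled under Edit -> Project Settings -> Quality, or at runtime; a minimal sketch (assuming the global quality setting is what matters for your setup):
using UnityEngine;

public class DisableAntiAliasing : MonoBehaviour
{
    void Awake()
    {
        // Global MSAA sample count; 0 turns anti-aliasing off entirely, which
        // avoids the multisampled renderbuffer path behind the
        // "renderbufferStorageMultisample: samples out of range" error above.
        QualitySettings.antiAliasing = 0;
    }
}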
The log output, though still weird, is the following for both scenarios (with and without SwiftShader); note that the framebuffer incomplete errors are gone:
PAGE LOG: [.WebGL-0x7f842a0f6400]GL ERROR :GL_INVALID_OPERATION : glFramebufferTexture2D: <- error from previous GL command
PAGE LOG: [.WebGL-0x7f842a0f6400]GL ERROR :GL_INVALID_OPERATION : GetShaderiv: <- error from previous GL command
Thanks!
I'm new to Xamarin Forms and I'm not able to understand why this line of code gives me an 'AccessDenied' socket exception:
tcpAsyCl.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.NoDelay, 1);
If instead, I use:
tcpAsyCl.NoDelay = true;
the application seems to work properly!
Other socket options, like:
tcpAsyCl.SetSocketOption(SocketOptionLevel.Socket, SocketOptionName.ReceiveTimeout, _timeout);
don't give me the same exception. My current test configuration is:
Windows 7; VS2019; emulating an Android 8.0 (API 26) tablet
That's because SocketOptionLevel.Socket should be SocketOptionLevel.Tcp, as stated here:
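In other words, the corrected call would be:
// NoDelay controls Nagle's algorithm, which is a TCP-level option,
// so it must be set at SocketOptionLevel.Tcp rather than SocketOptionLevel.Socket.
tcpAsyCl.SetSocketOption(SocketOptionLevel.Tcp, SocketOptionName.NoDelay, 1);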
I am playing with the MatrixMultiplication sample project, downloadable from the bottom of this page:
http://blogs.msdn.com/b/nativeconcurrency/archive/2011/11/02/matrix-multiplication-sample.aspx
When I change the values of M, N, W from 256 to 4096, an unhandled exception is thrown:
Unhandled exception at 0x7630C42D in MatrixMultiplication.exe: Microsoft C++ exception: Concurrency::accelerator_view_removed at memory location 0x001CE2F0.
The console output is:
Using device: NVIDIA GeForce GT 640M
MatrixDiemnsion C(4096x4096) = A(4096x4096) * B(4096x4096)
CPU(single core) exec completed.
AMP Simple
The next statement to be executed is leaving the function mxm_amp_simple.
I am using VS2013 Ultimate on Windows 7 Professional N.
Why does this occur, and how can I prevent it from happening?
EDIT: I have found that the greatest value of M, N, W for which AMP Simple does not hit a breakpoint is 2800 (M=2800, N=2800, W=2800).
AMP Tiled, on the other hand, sometimes hits a breakpoint and in other cases executes correctly with M, N, W equal to 4096.
The exception is accompanied by a system error message:
"Display driver stopped responding and has recovered. Display driver NVIDIA Windows Kernel Mode Driver, Version 331.65 stopped responding and has successfully recovered."
In case someone else needs this: the issue is most likely caused by Timeout Detection and Recovery (TDR). If a kernel runs for more than 2 seconds, Windows will kill it and throw a Concurrency::accelerator_view_removed exception. The easiest way to check this is to wrap the code in a try / catch block, e.g.:
try {
    av_c.synchronize();
} catch (const Concurrency::accelerator_view_removed& e) {
    printf("%s\n", e.what());
}
Microsoft has a blog post with more information, including pointers to instructions on how to disable it.
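If I remember correctly, those settings are registry values under the GraphicsDrivers key; a sketch for orientation only (verify the exact names and semantics against Microsoft's TDR documentation before changing anything):
; Under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers
; TdrDelay (REG_DWORD): seconds a kernel may run before TDR triggers (default 2)
; TdrLevel (REG_DWORD): 0 disables timeout detection entirely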
I wrote an APK to test the camera on Android 4.2.2, and it worked fine. However, when I moved this APK to Android 4.4, I ran into a problem with Camera::connect(). The call to Camera::connect() fails and prints this message:
W/AppOps ( 1546): Bad call: specified package TestCamera under uid 1000 but it is really -1
I think the reason may be USE_CALLING_UID, security, or something else I can't figure out.
Please give me some suggestions, thanks!
My APK is very simple, with only one activity. In onCreate(), I call a JNI function. The JNI function just runs the code below:
int cameraId = 0;
String16 clientPackageName("TestToGoService");
sp<Camera> camera = Camera::connect(cameraId, clientPackageName, Camera::USE_CALLING_UID);
if (camera == NULL) {
    ALOGE("camera==NULL.");
    return -1;
}
ALOGV("camera=%p.", camera.get());
What I have tried: if I put the code above into a standalone executable (with a main()), then Camera::connect() works OK.
I have already added the permissions in AndroidManifest.xml.
Thanks again!
I'm not sure if it's still of any help, but I had the same error in the past. The problem is clientPackageName: it has to be set to the exact package name of your application (which must have the proper camera permissions set in its manifest).
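A sketch of the fix, assuming the app's real package name is com.example.testcamera (a hypothetical name; substitute your own):
int cameraId = 0;
// Must match the package name declared in the APK's AndroidManifest.xml;
// otherwise AppOps rejects the call ("specified package ... but it is really -1").
String16 clientPackageName("com.example.testcamera");
sp<Camera> camera = Camera::connect(cameraId, clientPackageName, Camera::USE_CALLING_UID);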
After trying really hard, I am posting this question at 2 o'clock at night from my office. The problem is:
1) I have included the Kal calendar (link here) in my application.
2) It was working fine until I decided to include the XMPP framework (xmpp framework) in my application.
3) The main problem is that when I include the libidn.a file and compile the project, I get 4 errors, and to remove those errors I have to remove the "Other Linker Flags -> -all_load" setting.
4) That is where the problem begins: having removed the -all_load flag, the app compiles successfully, but when I run it and press the calendar button to load the calendar, it crashes with the following error:
-[__NSDate cc_dateByMovingToFirstDayOfTheMonth]: unrecognized selector sent to instance 0x75b85c0
2012-06-12 01:38:47.483 BizPro[10251:11903] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[__NSDate cc_dateByMovingToFirstDayOfTheMonth]: unrecognized selector sent to instance 0x75b85c0'
*** First throw call stack: (0x209e022 0x22f0cd6 0x209fcbd 0x2004ed0 0x2004cb2 0x12bc3d 0x12bb91 0x13149e 0x1315f6 0x12961 0xa8d38f 0xa8d5eb 0xa9dff1 0xa9e85f 0xa9e9e1 0xbbc5c2 0xa02d21 0x209fe42 0x856679 0x860579 0x7e54f7 0x7e73f6 0x874160 0x9d4f30 0x207299e 0x2009640 0x1fd54c6 0x1fd4d84 0x1fd4c9b 0x26e67d8 0x26e688a 0x9c4626 0x2a9d 0x2a15 0x1)
terminate called throwing an exception (lldb)
5) I know very well that this error message is misleading (NSDate is not causing the crash; I think the collision of static libraries is), because when I add the -all_load flag back and remove the libidn.a file from my project, it compiles and runs successfully and the calendar displays my data smoothly.
I googled a lot about this and got very little guidance, mostly relating the solution to workspaces and the like, but I really don't know what the solution could be. Please help me.
Thanks
cc_dateByMovingToFirstDayOfTheMonth is a category method on NSDate defined in NSDateAdditions.h from the Kal framework.
I had no end of problems and wanted more customisation of the Kal framework, so I just dragged all the source code into my project.
If you still have issues with the framework, remove Kal.a and just bring in the source code :) then you can get your hands dirty with it.
If you would rather keep Kal as a subproject, the way to fix this is to link the category files. Under
Project -> Build Settings -> Linking -> Other Linker Flags
add '-all_load'.
Sorry, I am answering late. I solved the problem: I removed the Kal.a file and just included the src folder (with all the required header files), so that it would not collide with the other static library.
Similar problem here. Because I'm using the Parse framework, I can't add -all_load, so integrating the static library will never work for me. The simplest way is to add all the source files in the src folder to my own project, rather than adding Kal.xcodeproj. The other configuration steps ("Copy Bundle Resources", "Header Search Paths", etc.) are still necessary. I also needed to add these lines from Kal_Prefix.pch to my own project's .pch:
#import "NSDate+Convenience.h"
#define RGBCOLOR(R,G,B) [UIColor colorWithRed:R/255.0 green:G/255.0 blue:B/255.0 alpha:1]
#define RGBACOLOR(R,G,B,A) [UIColor colorWithRed:R/255.0 green:G/255.0 blue:B/255.0 alpha:A]
#define kDarkGrayColor RGBCOLOR(51, 51, 51)
#define kGrayColor RGBCOLOR(153, 153, 153)
#define kLightGrayColor RGBCOLOR(185, 185, 185)