gl_PointCoord in OSX 10.7 Lion

I've run into a really bizarre shader error in many of the particle examples for Three.js. In the latest Chrome and Firefox on OSX 10.7, gl_PointCoord returns (1, 0) for every single fragment. Several of the included examples are affected: specifically, the Particle Billboards, Particle Billboard Colors, and Particle Sprites examples do not work. They render nothing to the screen. However, the Custom Attributes example works fine, despite having almost exactly the same code.
I found this bug while running compatibility checks on my own software. I wrote a shader which renders text sprites using particles. It works fine on 10.6, Windows, and Linux; however, 10.7 renders nothing. Further debugging by setting gl_FragColor = vec4(gl_PointCoord, 0., 1.); revealed that all the particles get rendered as solid red squares, instead of the expected red-green gradient.
I've been trying to determine exactly what difference between the examples makes Custom Attributes work but not Particle Billboards, but I haven't made much headway yet. This is exacerbated by my lack of a testing platform, aside from borrowing friends' laptops. Is this a known OS bug?
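For reference, the debug technique described above can be written as a minimal standalone fragment shader (the precision qualifier and entry point follow the usual WebGL conventions; this is a sketch, not taken verbatim from the Three.js examples):

```glsl
// Minimal debug fragment shader for point sprites.
// On a correct driver, each point shows a red-green gradient
// (red increasing along one axis, green along the other).
// On the buggy OSX 10.7 driver described above, gl_PointCoord
// is stuck at (1, 0), so every point renders as solid red.
precision mediump float;

void main() {
    gl_FragColor = vec4(gl_PointCoord, 0.0, 1.0);
}
```

Rendering this on each platform makes the difference immediately visible without any readback or console debugging.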

Related

URP Shader Graph shaders time node doesn't work in certain situations

I have several shaders that use the Time node to animate certain UI elements for my mobile game.
In certain cases (which I am unable to reproduce), the shaders will be stuck on a specific time and not animate.
For example, a shine shader might be stuck mid-shine.
Some important things to note:
Time.timeScale is 1 - all my tweens using scaled delta time are working correctly
When this happens all shaders using the time node in the active scene will "freeze in time"
This only reproduces on actual Android devices (iOS untested)
All parameters affecting time in said shaders are positive, valid numbers
There are no errors or warnings
When I transition from my main menu scene to the game scene, time-based shaders work correctly - when I come back to the main menu scene, it reproduces again (obviously it's something specific to that scene)
Running on Unity 2019.4.11f1 with URP & Shader Graph 7.5.1, but it did reproduce in older versions of Unity/URP as well
Sample shader - The preview actually looks exactly like the issue as it will appear in game when frozen:
For any lost souls who stumble upon this issue (which I doubt is by design - it seems like a bug):
The problem is that time isn't updated in shaders in scenes without a camera.
The camera doesn't have to render anything; it just needs to exist.
My main menu is pure UI elements and had no camera; adding a camera to the scene fixed the issue.
It is important to note that this behavior only happens once you build to an Android device; not sure if it reproduces on other platforms.

LWRP shaders not working on Android! Only working in Unity

I am new to Unity game development. I'm making a mobile game. In LWRP I made a custom shader graph (a glow shader). It works fine in Unity on my PC, but when I build an APK and play it on my Android device, it shows a plain block with no shader.
Am I missing something?
This is my shader graph.
After closer inspection, I noticed most of the shader works. Only the occlusion part doesn't work properly. I have set occlusion to 5. Normally, what occlusion does in my graph is saturate the material a little and give the shader a deep, reflective glow effect. It works on PC. But on the phone, I noticed the color did get saturated a little, but I didn't see the occlusion glow at all.
(I know I can produce nearly similar results without using occlusion, but I really felt occlusion suits more for my taste :P and I will be using occlusion for other objects too)
quality settings:
Check your Graphics API in Player Settings; it should be under Other Settings.

Unity Mobile depth buffer precision

I'm a mobile game developer using Unity Engine.
Here is my problem:
I tried to render the static scene geometry into a render target with a color buffer and a depth buffer, which I then reuse in the following frames (before the dynamic objects are rendered) as long as the main player's viewpoint stays the same. My goal is to reduce draw calls and save power on mobile devices. FYI, this strategy saves up to 20% of our MMO mobile game's power consumption on Android devices.
The following pics are screenshots from my test project. The sphere, cube, and terrain are static objects, and the red cylinder is moving.
You can see that the depth test result is wrong on Android.
The iOS device works fine: the depth test is correct, and the render result is almost the same as with the optimization off. Notice that the shadow is not right, but we ignore that for now.
However, the result on Android is not good. The moving cylinder is partly occluded by the cube, and the occlusion is not stable between frames.
It looks as if the depth buffer precision is not enough. Any ideas about this problem?
I googled this problem but found no straight answers. Some said we can't read the depth buffer on GLES.
https://forum.unity.com/threads/poor-performance-of-updatedepthtexture-why-is-it-even-needed.197455/
And then there are cases where platforms don't support reading from the Z buffer at all (GLES when no GL_OES_depth_texture is exposed; or Direct3D9 when no INTZ hack is present in the drivers; or desktop GL on Mac with some buggy Radeon drivers etc.).
Is this true?

Unity 3D - Particles suddenly becoming transparent

I'm developing a 3D game using Unity3D 4.5.2 (free version, not Pro).
I have used a default Particle System to make a Waterfall. I have placed the Waterfall particle system in the scene in such a way that there is a 2D Sprite of a Mountain behind it.
This is to give an impression to the user that the Waterfall is 'falling' out from the Mountain.
However, after about 10 seconds of simulating this waterfall particle system, the particles suddenly become transparent, enabling the user to see the Mountain behind it. I have provided screenshots below:
So I would really appreciate it if anyone could help me out here, as I've looked at a lot of solutions and fiddled around with all the parameters of the particle system in the Inspector, but to no avail...
To give you the correct answer to this question, I would need to see your particle system parameters. But I think something is wrong with the "Max Particles" amount or "Color over Lifetime". Also check "Start Lifetime"; maybe it is too high.

Matlab select Graphics Hardware Acceleration

Somehow the OpenGL renderer of my MATLAB (2010b) does not work correctly: the plots are weird compositions of patches instead of the smooth surfaces I actually generate. When I change the renderer to zbuffer it works fine; however, I then lose the ability to use transparency. Moreover, I have the feeling that using OpenGL would give me a performance boost, so that rotating, zooming, and so on would happen more quickly.
I think the reason lies in my computer hardware. I have an Intel HD Graphics 4600 plus an NVIDIA Quadro K610M/PCIe/SSE2. When I type "opengl info", that second graphics card is listed. I have already updated the driver, but nothing changed.
Any ideas what the problem may be?
My own solution idea: I would like to test the Intel HD Graphics, but I didn't find a way to set it as the default accelerator when running MATLAB. Do you know how to do that?
To switch the hardware used: you can choose the hardware support in the NVIDIA Control Panel, as shown in the link below.
Restart MATLAB and check using the "opengl info" command. The 'Vendor' should have switched to the integrated processor type.
To get your smooth surfaces, you can run
opengl software
upon starting MATLAB. This doesn't solve the underlying problem (I cannot run "opengl hardware" on my 64-bit Ubuntu machine with HD 4400 graphics either), but it might be a good workaround for the moment.
Also, this way you know the problem is OpenGL and not something else.
No idea how to switch the chosen graphics card, though, sorry.