User avatar
z0rg77
Topic Author
Posts: 15
Joined: 01 Apr 2020, 10:33
Contact:

Potential bug WebGL RenderTexture bound to texture source

21 Nov 2020, 13:48

Hi there,

I'm developing my website using Noesis in Unity, and one of the things I'm trying to achieve is to draw a shader as the application background.

The approach is simple: on the Unity side, I'm calling
Graphics.Blit(null, RenderTexture, mat);
where mat is a material with my custom shader.
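
In case it helps, this is roughly what the Unity side looks like; the MonoBehaviour and field names here are simplified placeholders, not my exact code:

    using UnityEngine;

    public class ShaderBackground : MonoBehaviour
    {
        public RenderTexture renderTexture; // the RT that the UI will sample
        public Material mat;                // material using my custom shader

        void Update()
        {
            // Run the shader full-screen into the render texture every frame
            Graphics.Blit(null, renderTexture, mat);
        }
    }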

On the Noesis side, I'm binding the "Source" property to a TextureSource exposed through the DataContext.
Here's what the DataContext side looks like:
        public TextureSource RenderTextureShader
        {
            get
            {
                // Access the render texture through a singleton (which is not null);
                // I explicitly specified the script execution order to guarantee this
                var rt = AppServices.Instance.ShaderBackgroundRT;
                var tex = Noesis.Texture.WrapTexture(rt);
                return new TextureSource(tex);
            }
        }
This works perfectly well in the editor, but when I build to WebGL it only works on the integrated GPU.
When I turn on hardware acceleration in the browser (I tested Chrome, Firefox and Edge; the behaviour is the same), the texture appears transparent: I see the background color of the Grid that contains the Image (even though, on the shader side, the returned alpha is always 1.0).

To be sure it wasn't a problem on the Unity side, I displayed the render texture on a quad with an unlit shader, and that shows up correctly in all scenarios.

I thought it might be an issue with the graphics API behaving differently on integrated vs. dedicated GPUs, so I switched to WebGL 1.0 (GL ES 2.0), but I get the same issue.

To summarize:
- RenderTexture seems transparent
- Works fine in editor and on integrated GPU, fails on dedicated GPU
- Same behaviour in Edge, Firefox and Chrome
- Same behaviour on 2 PCs (Intel / NVIDIA)
- Same behaviour on Unity 2019.3 and 2019.4
- Same behaviour on WebGL 1.0 and 2.0
- No error/warning message in console

How could I fix this?

Thanks in advance!
 
User avatar
jsantos
Site Admin
Posts: 3939
Joined: 20 Jan 2012, 17:18
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

24 Nov 2020, 19:14

This is weird. Are you sure the texture doesn't contain pixels with alpha < 1.0? As a test, could you disable the Blit and instead use a static texture with the same content (grabbing the output of your shader into a texture at editor time)?
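
Something like this (just a rough editor-side sketch, names are illustrative) could dump the RenderTexture contents to a PNG so you can import it back as a static texture for the test:

    using System.IO;
    using UnityEngine;

    public static class RenderTextureDumper
    {
        // Copies the current contents of 'rt' into a Texture2D and writes it as a PNG
        public static void SaveToPng(RenderTexture rt, string path)
        {
            var previous = RenderTexture.active;
            RenderTexture.active = rt;

            var tex = new Texture2D(rt.width, rt.height, TextureFormat.RGBA32, false);
            tex.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
            tex.Apply();

            RenderTexture.active = previous;
            File.WriteAllBytes(path, tex.EncodeToPNG());
        }
    }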

By the way, in the latest version of Noesis you can directly expose Unity textures in the data context.
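
Something along these lines should be enough, assuming the AppServices singleton from your snippet above (this is a sketch, not tested against your project):

    // Expose the Unity texture directly from the view model;
    // no manual WrapTexture/TextureSource step needed
    public UnityEngine.Texture RenderTextureShader
    {
        get { return AppServices.Instance.ShaderBackgroundRT; }
    }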
 
User avatar
z0rg77
Topic Author
Posts: 15
Joined: 01 Apr 2020, 10:33
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 10:22

OK, so here's the fragment code:
            fixed4 frag (v2f i) : SV_Target
            {
                fixed2 uv = i.uv;
                fixed3 col;

                // Center the UVs and scale them
                uv -= fixed2(0.5, 0.5);
                uv *= 5.0;

                col = rdr(uv);

                // Alpha is always fully opaque
                return fixed4(col.x, col.y, col.z, 1.0);
            }
I've since found a warning in the browser console that I didn't notice at first. It only appears when the texture doesn't show, so the issue is most likely there:
413161aa-968e-457e-8e89-e89e49cd241e:8 WebGL: INVALID_ENUM: getInternalformatParameter: invalid internalformat
_glGetInternalformativ @ 413161aa-968e-457e-8e89-e89e49cd241e:8


Binding to a RenderTexture without the Blit results in the same behaviour (same warning as above).
Binding to a regular Unity texture throws the same warning, but the texture shows up.

Exposing the RenderTexture directly as a Unity texture also does the same thing, but thanks for the tip!

The culprit of the warning is here (as reported by the browser):
        function _glGetInternalformativ(target, internalformat, pname, bufSize, params) {
            if (bufSize < 0) {
                GL.recordError(1281);
                return
            }
            var samples = GLctx["getInternalformatParameter"](target, internalformat, 32937); // This is the one that causes the warning
            if (!samples) {
                GL.recordError(1280);
                return
            }
            switch (pname) {
            case 32937:
            ...
but I don't know whether this is Emscripten-generated code or Noesis land.
 
User avatar
jsantos
Site Admin
Posts: 3939
Joined: 20 Jan 2012, 17:18
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 11:06

z0rg77 wrote:
Binding to a RenderTexture without the Blit results in the same behaviour (same warning as above).
Binding to a regular Unity texture throws the same warning, but the texture shows up.

Do you know what format this is?
 
User avatar
z0rg77
Topic Author
Posts: 15
Joined: 01 Apr 2020, 10:33
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:21

The RenderTexture is R8G8B818_UNorm.
The Texture is RGB8 UNorm.

I don't know what that specific value means, but I can investigate if that's what you're asking.
 
User avatar
jsantos
Site Admin
Posts: 3939
Joined: 20 Jan 2012, 17:18
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:25

Formats without alpha have been problematic in the past. Could you try the same format but with alpha (R8G8B8A8 or something like that)?
 
User avatar
z0rg77
Topic Author
Posts: 15
Joined: 01 Apr 2020, 10:33
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:30

Whoops, that was a typo: there is alpha, the RenderTexture's format is R8G8B8A8_UNorm.
 
User avatar
jsantos
Site Admin
Posts: 3939
Joined: 20 Jan 2012, 17:18
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:32

Oops, yes, sorry. What about the texture?
 
User avatar
z0rg77
Topic Author
Posts: 15
Joined: 01 Apr 2020, 10:33
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:50

The texture is RGB8 UNorm, according to the Unity inspector.

I did a test that fixed the issue: I hadn't noticed that a depth buffer was enabled for the RenderTexture. Setting it to "No depth buffer" makes it work, even though the warning is still there.

I don't know what's going on, since I'm not writing to the depth buffer at all in the shader. Do you use depth information to display the textures?
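
For reference, if the RenderTexture were created from code instead of as an asset, the equivalent of that inspector change would be passing 0 for the depth buffer (the sizes here are just examples):

    // 0 = "No depth buffer"; 16/24/32 would allocate a depth (and possibly stencil) buffer
    var rt = new RenderTexture(1024, 1024, 0, RenderTextureFormat.ARGB32);
    rt.Create();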
 
User avatar
jsantos
Site Admin
Posts: 3939
Joined: 20 Jan 2012, 17:18
Contact:

Re: Potential bug WebGL RenderTexture bound to texture source

25 Nov 2020, 12:58

When Noesis renders into a RenderTexture you need to enable the depth buffer, because that also enables the stencil buffer, and we use stencil for applying masks.

But in this case, yes, you don't need it at all. I'm not sure what Unity is doing when sampling a render target that has depth, but that is definitely what is causing the error.
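
To illustrate the two cases (a sketch, with illustrative sizes):

    // A texture you only sample from the UI (your background shader output):
    // no depth buffer is needed.
    var background = new RenderTexture(1024, 1024, 0, RenderTextureFormat.ARGB32);

    // A texture Noesis renders into: request a 24-bit depth buffer, which in Unity
    // also allocates a stencil buffer; we need stencil for masks/clipping.
    var uiTarget = new RenderTexture(1024, 1024, 24, RenderTextureFormat.ARGB32);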
